
On-Site
Full-Time
Hyderabad, Telangana
India
Skills
Amazon Web Services (AWS)
Amazon S3
About the Role
Job Summary:
We are seeking a highly experienced and motivated Lead Developer with a strong background in Java, Big Data technologies, and AI to lead and mentor a team of talented engineers. You will be responsible for designing, developing, and implementing cutting-edge solutions leveraging the power of Big Data and AI to solve complex business challenges. This role requires a deep understanding of data processing, data warehousing, and machine learning principles, along with hands-on experience in developing and deploying applications in cloud environments like AWS and GCP. You will be a technical leader, driving best practices, providing mentorship, and ensuring the successful delivery of high-quality, scalable, and maintainable solutions.
Responsibilities:
Technical Leadership & Mentorship:
Provide technical leadership and guidance to a team of developers, fostering a collaborative and innovative environment.
Mentor and coach team members on best practices, coding standards, design patterns, and emerging technologies.
Conduct code reviews and provide constructive feedback to ensure code quality and adherence to standards.
Drive technical discussions and contribute to architectural decisions.
Solution Design & Development:
Design and develop scalable and robust Big Data solutions using Hadoop, HDFS, Spark, Scala, and Python.
Build and maintain data pipelines for data ingestion, processing, and storage.
Develop and deploy applications using Java full-stack technologies.
Design and implement data warehouses and data lakes using SQL technologies such as BigQuery, Athena, Spark SQL, and Snowflake.
Develop and implement machine learning models using Python and associated libraries (e.g., scikit-learn, TensorFlow, PyTorch).
Develop analytics dashboards and reports using Python and related libraries.
Cloud Architecture & Deployment:
Design and implement solutions in cloud environments (AWS and GCP) leveraging cloud-native services.
Optimize infrastructure and applications for performance, scalability, and cost-efficiency in the cloud.
Implement and maintain CI/CD pipelines for automated build, test, and deployment.
Collaboration & Communication:
Work closely with product managers, data scientists, and other stakeholders to understand requirements and translate them into technical solutions.
Communicate technical concepts clearly and effectively to both technical and non-technical audiences.
Participate in agile development processes, including sprint planning, daily stand-ups, and retrospectives.
Innovation & Continuous Learning:
Stay up-to-date with the latest trends and technologies in Big Data, AI, and cloud computing.
Identify opportunities to leverage new technologies to improve existing systems and processes.
Contribute to the development of internal tools and frameworks.
Data Analysis & Reporting:
Create insightful data visualizations using Python (Matplotlib, Seaborn).
Work with the product team to gain deep insights from data and create reports.
Apply working knowledge of R for reporting.
AI Models:
Demonstrate a solid understanding of AI models and their implementation.
Qualifications:
Experience: 10+ years of experience in software development, with a focus on Big Data and AI technologies.
Programming Languages: Strong proficiency in Java, Scala, Python, and SQL.
Big Data Technologies: Deep understanding and hands-on experience with Hadoop, HDFS, Spark, and related technologies.
Data Warehousing: Experience with data warehousing solutions such as BigQuery, Athena, Spark SQL, and Snowflake.
Full-Stack Development: Proven experience with Java full-stack development, including backend technologies and front-end frameworks.
Cloud Computing: Hands-on experience with AWS and/or GCP, including services such as EC2, S3, Lambda, BigQuery, Dataflow, etc.
AI/ML: Experience with machine learning algorithms and frameworks (e.g., scikit-learn, TensorFlow, PyTorch).
Data Analysis & Reporting: Strong grasp of data visualization tools in Python; familiarity with R is a plus.
AI: Good understanding of AI models and their implementation.
DevOps: Experience with CI/CD pipelines and automation tools.
Leadership Skills: Proven ability to lead and mentor a team of developers.
Communication Skills: Excellent written and verbal communication skills.
Education: Bachelor's degree in Computer Science or a related field.
Bonus Points:
Experience with specific industries (e.g., e-commerce, finance, healthcare).
Contributions to open-source projects.
Certifications in AWS or GCP.
Master's degree in Computer Science or a related field.