Avathon

Software Engineer (Python)

Bengaluru, Karnataka, India (On-site, Full-Time)

About the Role

Who We Are & Why Join Us

Avathon is revolutionizing industrial AI with a powerful platform that enables businesses to harness the full potential of their operational data. Our technology seamlessly integrates and contextualizes siloed datasets, providing a 360-degree operational view that enhances decision-making and efficiency. With advanced capabilities like digital twins, natural language processing, normal behavior modeling, and machine vision, we create real-time virtual replicas of physical assets, enabling predictive maintenance, performance simulation, and operational optimization. Our AI-driven models empower companies with scalable solutions for anomaly detection, performance forecasting, and asset lifetime extension—all tailored to the complexities of industrial environments.
Cutting-Edge AI Innovation – Join a team at the forefront of AI, developing groundbreaking solutions that shape the future.
High-Growth Environment – Thrive in a fast-scaling startup where agility, collaboration, and rapid professional growth are the norm.
Meaningful Impact – Work on AI-driven projects that drive real change across industries and improve lives.
Learn more at: Avathon

Job Title: Backend Engineer (Python Data Platform Focus)
Location: Bengaluru (Garudacharpalya Metro)
Work Mode: 4 days in office
Interview Process: online test, virtual interview, and one face-to-face discussion.
Experience: 5 to 7 years

Overview:
We are looking for a seasoned Backend Engineer with strong Python expertise to join our team in building scalable, high-performance systems that power data-intensive applications. This role is ideal for someone who thrives in designing backend architectures that handle massive read/write operations and is passionate about building resilient, distributed systems.

Key Responsibilities:
Design and implement scalable backend services using Python frameworks such as Django and FastAPI.
Build and maintain high-throughput APIs and data pipelines optimized for performance and reliability.
Architect systems that support large-scale data ingestion, processing, and querying with minimal latency.
Work with relational and non-relational databases (e.g., PostgreSQL, MongoDB, Redis) to ensure data consistency and performance under heavy load.
Collaborate with DevOps to containerize services using Docker and orchestrate deployments with Kubernetes.
Implement caching, sharding, and queuing strategies to optimize system responsiveness and throughput.
Ensure code quality through unit testing, integration testing, and adherence to clean code principles.
Participate in system design discussions, code reviews, and performance tuning sessions.
Collaborate with data engineers, analysts, and frontend developers to deliver end-to-end solutions.
Monitor and troubleshoot production systems, ensuring high availability and fault tolerance.

Required Qualifications:
Education: Bachelor’s degree in Computer Science, Engineering, or equivalent practical experience.
Experience: 5-6 years of backend development with a strong focus on Python.
Proficiency in Django & FastAPI for building RESTful APIs and backend services.
Experience designing systems for high-volume data environments (e.g., analytics platforms, event-driven systems).
Solid understanding of database internals, indexing, and query optimization for both SQL and NoSQL systems.
Familiarity with message brokers and task/stream processing tools (e.g., Kafka, RabbitMQ, Celery).
Experience with containerization (Docker) and orchestration (Kubernetes) in production environments.
Strong grasp of software engineering principles, including scalability, fault tolerance, and observability.
Proficient with Git-based workflows and CI/CD pipelines.
Excellent problem-solving skills and ability to work collaboratively in cross-functional teams.

Nice to Have:
Experience with time-series databases or OLAP systems (e.g., InfluxDB, Apache Druid).
Exposure to cloud-native architectures on AWS, GCP, or Azure.
Familiarity with data lake or data warehouse technologies.
