Database Administrator
About Zeliot
At Zeliot, we are redefining the future of real-time data streaming, empowering enterprises and developers to unlock the full potential of their data with speed, simplicity, and scale. We envision a world where data moves seamlessly, insights are delivered instantly, and innovation happens at the speed of thought.
To bring this vision to life, we created Condense, a next-generation all-in-one data streaming platform that radically simplifies how real-time applications are built, deployed, and scaled. Condense eliminates the complexities traditionally associated with infrastructure management through fully managed Kafka, intelligent autoscaling, and a Bring Your Own Cloud (BYOC) deployment model. This frees developers from operational overhead and lets them focus entirely on creating new real-time experiences.
With an AI-driven development framework and a Custom Transformation Framework, Condense allows developers to write, test, and deploy stream processing logic in their preferred programming languages, accelerating innovation and shortening development cycles. This enables enterprises to bring new applications to market faster and operate at true cloud-native speed, supported by optimized infrastructure utilization and a 40–60% reduction in total cost of ownership (TCO), all while maintaining data sovereignty, performance, and scalability across cloud environments.
Driven by deep domain expertise across connected mobility, IoT, and large-scale data ecosystems, Zeliot extends beyond platform innovation to deliver the complete ecosystem enterprises need to realize their real-time data ambitions. By combining advanced streaming technology with contextual intelligence and industry focus, Zeliot enables organizations to build, scale, and manage real-time applications with exceptional efficiency and measurable business impact.
At Zeliot, streaming becomes effortless, development becomes frictionless, and innovation becomes continuous, transforming how enterprises turn data into decisions and vision into value.
Location: Bangalore, India
Employment Type: Full-time
Role Summary:
We are seeking an experienced and performance-driven Database Administrator (DBA) to join our team. The ideal candidate will be responsible for ensuring optimal database utilization and writing complex, high-performance queries on large-scale IoT data. You will play a key role in maintaining database health, optimizing performance, and supporting advanced analytics use cases.

Key Responsibilities:
Database Performance Optimization:
  • Monitor and tune database performance, resource usage, and query execution plans.
  • Analyze and improve indexing strategies, partitioning, and caching mechanisms.
  • Identify and address performance bottlenecks in real-time data pipelines.
Complex Query Development:
  • Design, write, and optimize complex SQL queries for high-volume IoT datasets.
  • Collaborate with data engineers and analysts to support real-time and batch data processing.
  • Develop procedures and scripts for automation, data transformations, and reporting.
Database Management & Maintenance:
  • Maintain the availability, integrity, and security of databases (e.g., BigQuery, PostgreSQL, MySQL, TimescaleDB, InfluxDB).
  • Plan and implement backup, restore, and disaster recovery strategies.
  • Support schema design and data modelling for large-scale time-series data.
Collaboration & Documentation:
  • Work closely with development, DevOps, and data teams to support business needs.
  • Maintain clear documentation of database configurations, structures, and queries.
  • Contribute to best practices and guidelines for database usage across the organization.
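To give a flavor of the query-tuning work described above, here is a minimal sketch of how a composite index supports the per-device time-range queries typical of IoT telemetry. It uses Python's built-in sqlite3 purely for illustration (the role itself targets databases such as PostgreSQL, TimescaleDB, and BigQuery); the table and column names are hypothetical.

```python
import sqlite3

# Hypothetical telemetry table: one row per (device, timestamp) reading.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("""
    CREATE TABLE telemetry (
        device_id INTEGER,
        ts        INTEGER,   -- epoch seconds
        value     REAL
    )
""")

# Composite index matching the common access pattern:
# one device, bounded time range.
cur.execute("CREATE INDEX idx_telemetry_device_ts ON telemetry (device_id, ts)")

# Seed a small synthetic dataset: 3 devices x 100 timestamps.
cur.executemany(
    "INSERT INTO telemetry VALUES (?, ?, ?)",
    [(d, t, float(t)) for d in range(3) for t in range(100)],
)

# EXPLAIN QUERY PLAN reveals whether the optimizer searches the index
# rather than scanning the whole table.
plan = cur.execute(
    "EXPLAIN QUERY PLAN "
    "SELECT avg(value) FROM telemetry "
    "WHERE device_id = ? AND ts BETWEEN ? AND ?",
    (1, 10, 50),
).fetchall()
print(plan[0][3])  # the plan detail should mention idx_telemetry_device_ts
```

The same principle carries over to production engines: align index column order with the query's equality predicates first (device), then the range predicate (time), so the planner can seek directly to the relevant slice.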

Qualifications/Skills:
  • 3+ years of experience as a DBA or in a similar role, preferably with large-scale IoT or time-series data.
  • Strong expertise in SQL and query optimization.
  • Hands-on experience with performance tuning and monitoring tools.
  • Familiarity with time-series databases (e.g., InfluxDB, TimescaleDB) is a strong plus.
  • Experience with BigQuery and Python is a strong plus.
  • Solid understanding of indexing, sharding, replication, and partitioning.
  • Experience with database security and backup strategies.
  • Familiarity with cloud database solutions (e.g., AWS RDS, Azure SQL Database, GCP Cloud SQL) is a plus.
  • Strong problem-solving skills and attention to detail.

Good-to-Have Skills:
  • Experience working with streaming platforms (Kafka, MQTT, etc.).
  • Exposure to big data ecosystems (e.g., Hadoop, Spark).
  • Scripting skills (e.g., Python, Bash) for automation tasks.
  • Knowledge of DevOps practices and infrastructure-as-code (e.g., Terraform).

What We Offer:
  • Competitive compensation and comprehensive benefits.
  • Opportunity to work on a cutting-edge real-time data platform.
  • High ownership, autonomy, and impact.
  • Collaborative, fast-paced, deep-tech environment.
  • Strong focus on learning, growth, and long-term career development.

Want to build the future of real-time systems?
Join Zeliot and help shape how enterprises turn streaming data into real-world impact.