Description
10+ years of experience required
– Proven ability to lead and manage large teams across different geographic regions.
– Clear understanding of, and hands-on working experience with, SDLC, CI/CD, and Agile/Sprint programs.
– Ability to coordinate with application and platform teams; provide hypercare support, performance optimization, and unit and integration test support.
– Mentor and support colleagues and the L2 team for L3 technical delivery.
– Manage service SLAs, including availability (through proactive monitoring) and incident resolution within agreed targets.
– Strong client-facing role, on-site at the customer location or remote.
DevOps & Cloud Infrastructure Management
Data Visualization & BI Tool Integration
Job Summary:
We are seeking a skilled Middleware EDH Consultant with expertise in DataStax to design, implement, and manage enterprise data integration and distribution solutions. The ideal candidate will have hands-on experience with DataStax Enterprise (DSE), Apache Cassandra, and middleware technologies to support real-time and batch data flows across complex enterprise systems.
________________________________________
Key Responsibilities:
• Design and implement Enterprise Data Hub (EDH) solutions using DataStax Enterprise and related technologies.
• Develop and maintain middleware components for data ingestion, transformation, and distribution.
• Integrate EDH with upstream and downstream systems including ERP, CRM, billing, and analytics platforms.
• Optimize data models and queries for performance and scalability in Cassandra-based environments.
• Ensure data consistency, reliability, and security across distributed systems.
• Monitor and troubleshoot middleware and EDH components to ensure high availability.
• Collaborate with data architects, application developers, and infrastructure teams.
• Document technical designs, data flows, and operational procedures.
________________________________________
Required Skills & Qualifications:
• Strong experience with DataStax Enterprise (DSE) and Apache Cassandra.
• Proficiency in middleware integration, data pipelines, and real-time data processing.
• Experience with Kafka, Spark, REST APIs, and ETL tools.
• Solid understanding of distributed systems, NoSQL databases, and data replication strategies.
• Familiarity with DevOps tools, CI/CD pipelines, and containerization (Docker/Kubernetes).
• Knowledge of data governance, security, and compliance frameworks.
• Excellent problem-solving and communication skills.
________________________________________