Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
costaffglobal.in
253 Views
We are seeking an experienced Java Full Stack Developer to join our dynamic engineering team in Bangalore. In this role, you will be responsible for designing, developing, and maintaining scalable microservices-based applications using modern technologies. You will work on building robust backend systems using Java and Spring Boot while also developing responsive front-end interfaces using React or Angular to provide a seamless user experience.
The ideal candidate will have 6 to 12 years of experience and a deep understanding of Apache Kafka for event-driven architecture and Kubernetes for container orchestration. You will collaborate with cross-functional teams in an agile environment to ensure high-quality code through rigorous testing and peer reviews. Joining our team offers the opportunity to work on cutting-edge technologies and high-impact projects that reach a global scale within a collaborative and growth-focused environment.
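Purely for illustration of the event-driven pattern this posting describes: the role's own stack is Java with Spring Kafka, but the sketch below shows the same consume-and-process loop in Python using the kafka-python client, to keep all examples in this document in one language. The topic, broker address, and consumer group are placeholders.

```python
import json

from kafka import KafkaConsumer  # pip install kafka-python

# Hypothetical topic, broker, and group names used only for illustration.
consumer = KafkaConsumer(
    "order-events",
    bootstrap_servers="localhost:9092",
    group_id="order-service",
    auto_offset_reset="earliest",
    value_deserializer=lambda raw: json.loads(raw.decode("utf-8")),
)

for message in consumer:
    event = message.value
    # In a real microservice this step would update state, call another
    # service, or publish a follow-up event to another topic.
    print(f"processing order {event.get('orderId')} from partition {message.partition}")
```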
Key Requirements
6 to 12 years of professional experience in software development
Expert-level proficiency in Java and the Spring Boot framework
Demonstrated hands-on experience with Microservices Architecture
Deep expertise in Apache Kafka or other high-throughput messaging systems
Experience containerizing applications using Docker and orchestrating with Kubernetes
Proficiency in frontend technologies including React, Angular, or modern JavaScript
Strong understanding of RESTful API design and implementation
Familiarity with CI/CD pipelines and DevOps best practices
Experience working with cloud infrastructure such as AWS, Microsoft Azure, or GCP
Proven ability to collaborate effectively within an agile development team
Salary: Negotiable or Not Mentioned
India, Bangalore
6 days ago
gmail.com
447 Views
We are looking for passionate Full Stack Software Engineers to join our growing team in Bangalore. In this role, you will work on modern, scalable applications using technologies like C# .NET, Python, and React. You will be part of a dynamic team focused on building high-quality software and collaborating with cross-functional teams to deliver impactful cloud-based solutions. This opportunity is ideal for developers who enjoy working in a cloud-centric environment and contributing to the complete lifecycle of software products.
Your responsibilities will include developing APIs, microservices, and integration layers while creating intuitive, high-performance user interfaces. You will contribute across the full Software Development Life Cycle (SDLC) from design and development to deployment on Azure. Additionally, you will work on improving CI/CD pipelines and optimizing system performance to ensure robust and efficient delivery. Candidates will also have the chance to explore AI-based solutions and modern containerization technologies to further enhance their technical skill set.
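As a rough sketch of the API and integration work described above, here is a minimal REST microservice in Python (the posting allows Python or C# .NET for the backend; Flask, the route names, and the in-memory store are assumptions chosen only to keep the example short and self-contained).

```python
from flask import Flask, jsonify, request  # pip install flask

app = Flask(__name__)

# Illustrative in-memory store standing in for a real database layer.
ORDERS = {}

@app.get("/health")
def health():
    return jsonify(status="ok")

@app.post("/orders")
def create_order():
    payload = request.get_json(force=True)
    order_id = len(ORDERS) + 1
    ORDERS[order_id] = payload
    return jsonify(id=order_id, **payload), 201

@app.get("/orders/<int:order_id>")
def get_order(order_id: int):
    order = ORDERS.get(order_id)
    if order is None:
        return jsonify(error="not found"), 404
    return jsonify(id=order_id, **order)

if __name__ == "__main__":
    app.run(port=8080)
```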
Key Requirements
Experience in C# .NET or Python development for backend services.
Proficiency in React for building dynamic front-end user interfaces.
Strong knowledge of building and designing scalable enterprise applications.
Experience developing REST APIs, microservices, and integration layers.
Familiarity with Azure cloud services for deployment and scaling.
Deep understanding of databases and data management strategies.
Proficiency with Git, CI/CD pipelines, and general DevOps practices.
Strong problem-solving abilities and clear communication skills.
Knowledge of containerization technologies like Docker and Kubernetes.
Experience with the full Software Development Life Cycle (SDLC).
Salary: Negotiable or Not Mentioned
India, Bangalore
10 days ago
fxconsulting.in
774 Views
We are seeking a highly skilled Technical Lead for Data Engineering to join our dynamic team in Bangalore. This role is centered on building and scaling high-performance data systems that support our product-driven initiatives. As a lead, you will be at the forefront of designing scalable ETL pipelines and leveraging technologies such as Spark, Hadoop, and Kafka for large-scale data processing. Your expertise will ensure that our data infrastructure is robust, efficient, and capable of handling complex data workloads.
In addition to your technical responsibilities, you will provide leadership to the engineering team and work collaboratively with Data Scientists to optimize data models and ensure top-tier data quality and security. You will be expected to monitor and troubleshoot data pipelines while maintaining high standards for data governance. The ideal candidate brings 6 to 9 years of experience, a strong background in Python or Scala, and a deep understanding of cloud platforms like AWS, Azure, or GCP. This is a fantastic opportunity for a professional looking to lead engineering excellence in a fast-paced environment.
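A minimal PySpark sketch of the kind of ETL step this role centres on: read raw events, clean them, aggregate, and write a curated output. The S3-style paths, column names, and aggregation are hypothetical; a production pipeline would add schema enforcement, partition pruning, and data-quality checks.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("daily-events-etl").getOrCreate()

# Hypothetical input/output locations; swap in real bucket paths.
raw = spark.read.json("s3a://example-raw/events/2024-01-01/")

cleaned = (
    raw.filter(F.col("event_type").isNotNull())
       .withColumn("event_date", F.to_date("event_timestamp"))
)

daily_counts = cleaned.groupBy("event_date", "event_type").agg(
    F.count("*").alias("event_count")
)

daily_counts.write.mode("overwrite").partitionBy("event_date").parquet(
    "s3a://example-curated/daily_event_counts/"
)
```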
Key Requirements
6 to 9 years of professional experience in Data Engineering.
Proven expertise in Spark and other Big Data technologies.
Proficiency in coding with Python, Scala, or Java.
Extensive experience in developing and optimizing ETL pipelines.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong knowledge of Hadoop and Kafka for large-scale data processing.
Demonstrated experience in team management and leadership roles.
Ability to design and optimize complex data models.
Understanding of data quality, governance, and security principles.
Exceptional problem-solving skills and ability to work in fast-paced environments.
Salary: Negotiable or Not Mentioned
India, Bangalore
10 days ago
refex.co.in
873 Views
RGML is seeking a proactive and experienced DevOps Engineer to manage and scale our new AWS-based cloud architecture. This role is central to building a secure, fault-tolerant, and highly available environment that supports our Sun, Drive, and Comm platforms. You will be responsible for designing and implementing infrastructure using various AWS services across multiple Availability Zones, ensuring the platform remains robust and scalable. In this position, you'll play a critical role in automation, deployment pipelines, monitoring, and cloud cost optimization.
Key tasks include maintaining EC2 Auto Scaling Groups, managing API Gateways, and optimizing Aurora SQL Clusters with multi-AZ failover strategies. You will also enforce infrastructure-as-code practices and collaborate with engineering teams to enable DevSecOps best practices, driving the transformation of legacy systems into modern, scalable infrastructure. You will be working on mission-critical mobility platforms with a growing user base, offering a collaborative and fast-paced environment where you can drive automation and shape future DevSecOps practices.
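As a small illustration of the automation side of this role, the sketch below uses boto3 (assuming the Python scripting route) to inspect an EC2 Auto Scaling Group and pull its recent CPU utilisation from CloudWatch. The group name and region are placeholders, and in practice this kind of check would usually live in CloudWatch alarms or dashboards rather than an ad-hoc script.

```python
from datetime import datetime, timedelta, timezone

import boto3  # pip install boto3

REGION = "ap-south-1"            # placeholder region
ASG_NAME = "example-web-asg"     # placeholder Auto Scaling Group name

autoscaling = boto3.client("autoscaling", region_name=REGION)
cloudwatch = boto3.client("cloudwatch", region_name=REGION)

# Report desired vs. in-service capacity for the group.
group = autoscaling.describe_auto_scaling_groups(
    AutoScalingGroupNames=[ASG_NAME]
)["AutoScalingGroups"][0]
in_service = [i for i in group["Instances"] if i["LifecycleState"] == "InService"]
print(f"{ASG_NAME}: desired={group['DesiredCapacity']} in_service={len(in_service)}")

# Pull average CPU utilisation for the last hour, in 5-minute buckets.
now = datetime.now(timezone.utc)
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/EC2",
    MetricName="CPUUtilization",
    Dimensions=[{"Name": "AutoScalingGroupName", "Value": ASG_NAME}],
    StartTime=now - timedelta(hours=1),
    EndTime=now,
    Period=300,
    Statistics=["Average"],
)
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    print(point["Timestamp"], round(point["Average"], 2))
```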
Key Requirements
Strong hands-on experience with AWS core services: EC2 (Linux and Windows), ALB, VPC, S3, Aurora, CloudWatch, API Gateway, IAM, and VPN.
Deep understanding of multi-AZ, high availability, and auto-healing architectures.
Experience with CI/CD tools such as GitHub Actions, Jenkins, or CodePipeline, and scripting in Bash or Python.
Working knowledge of networking and cloud security best practices including Security Groups, NACLs, and IAM roles.
Experience with Bastion architecture, Client VPNs, Route 53, and VPC peering.
Familiarity with backup and restore strategies and monitoring/logging pipelines.
Proven ability to implement and maintain infrastructure-as-code practices using Terraform or CloudFormation.
Ability to design and manage infrastructure across multiple Availability Zones to ensure fault tolerance.
Experience maintaining and scaling EC2 Auto Scaling Groups and Application Load Balancers.
Proficiency in setting up and optimizing Aurora SQL Clusters with multi-AZ active-active failover strategies.
Salary: Negotiable or Not Mentioned
India, Bengaluru
9 days ago
scienstechnologies.com
650 Views
Sciens Technologies is seeking a dedicated Database Support Engineer with specialized expertise in AWS database services to join our team in Bengaluru. This role is pivotal in managing and supporting a variety of AWS databases, including RDS, Aurora, DynamoDB, DocumentDB, and ElastiCache. As part of a global "follow-the-sun" model, you will be responsible for ensuring the high availability, reliability, and performance of mission-critical systems. This involves active participation in incident management, root cause analysis (RCA), and providing 24/7 production support to maintain seamless operations for our global clients.
Beyond routine maintenance, the ideal candidate will drive performance optimization through query tuning and strategic indexing. You will also leverage automation using Python or Bash to streamline database operations and enhance system efficiency. Collaboration is a key component of this role, as you will work closely with global teams to maintain comprehensive documentation and runbooks. Monitoring system health using tools like CloudWatch, Prometheus, and Grafana will be part of your daily activities to ensure proactive issue resolution and disaster recovery preparedness. Candidates with a background in Kubernetes and Terraform are highly encouraged to apply.
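A minimal sketch of the Python automation mentioned above: taking an on-demand RDS snapshot and listing recent manual snapshots with boto3. The instance identifier and region are placeholders, and a real runbook would add error handling, tagging, and retention logic.

```python
from datetime import datetime, timezone

import boto3  # pip install boto3

REGION = "ap-south-1"              # placeholder region
DB_INSTANCE = "example-orders-db"  # placeholder RDS instance identifier

rds = boto3.client("rds", region_name=REGION)

# Take an on-demand snapshot named with a UTC timestamp.
stamp = datetime.now(timezone.utc).strftime("%Y%m%d-%H%M%S")
snapshot_id = f"{DB_INSTANCE}-manual-{stamp}"
rds.create_db_snapshot(
    DBSnapshotIdentifier=snapshot_id,
    DBInstanceIdentifier=DB_INSTANCE,
)
print(f"requested snapshot {snapshot_id}")

# List existing manual snapshots for the instance and their status.
snapshots = rds.describe_db_snapshots(
    DBInstanceIdentifier=DB_INSTANCE,
    SnapshotType="manual",
)["DBSnapshots"]
for snap in snapshots:
    print(snap["DBSnapshotIdentifier"], snap["Status"])
```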
Key Requirements
3–5 years of professional Database Administration experience.
Strong hands-on experience specifically with AWS RDS and Aurora services.
In-depth knowledge of backup/recovery, High Availability (HA), and Disaster Recovery (DR) strategies.
Proficiency in scripting languages such as Python or Bash for operational automation.
Hands-on experience with monitoring and alerting tools like CloudWatch, Prometheus, or Grafana.
Proven ability to handle incident management and root cause analysis (RCA) in a production environment.
Expertise in SQL query tuning, indexing strategies, and database performance optimization.
Familiarity with AWS Database Migration Service (DMS) and general migration strategies.
Knowledge of Kubernetes (EKS) and managing containerized database environments.
Understanding of Infrastructure as Code (IaC) principles using Terraform or CloudFormation.
Salary: Negotiable or Not Mentioned
India, Bangalore Whitefield
10 days ago
infogain.com
962 Views
Infogain is currently seeking a highly skilled Java Developer or Lead professional to join our dynamic team in Bangalore Whitefield. The successful candidate will be responsible for designing, developing, and maintaining high-quality software solutions using Java, Spring Boot, and Microservices architecture. This role requires a strong technical background and the ability to work in a fast-paced environment to deliver robust and scalable applications that meet business requirements.
In this role, you will collaborate with cross-functional teams to identify technical needs and translate them into functional software components. You will be involved in the full software development lifecycle, from initial design and coding to testing and deployment. Proficiency in SQL and Hibernate is essential for managing data persistence and optimization. We are looking for individuals who are passionate about technology and committed to continuous improvement and innovation within the enterprise software space.
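The persistence layer for this role is Hibernate on the JVM; purely for illustration, and to keep the examples in this document in one language, the sketch below shows the analogous ORM pattern in Python with SQLAlchemy (entity mapping, session, and a simple query) against a hypothetical employees table.

```python
from sqlalchemy import Integer, String, create_engine, select
from sqlalchemy.orm import DeclarativeBase, Mapped, Session, mapped_column


class Base(DeclarativeBase):
    pass


class Employee(Base):
    __tablename__ = "employees"
    id: Mapped[int] = mapped_column(Integer, primary_key=True)
    name: Mapped[str] = mapped_column(String(100))
    team: Mapped[str] = mapped_column(String(50))


# In-memory SQLite keeps the sketch self-contained; a real service
# would point at its production database instead.
engine = create_engine("sqlite:///:memory:")
Base.metadata.create_all(engine)

with Session(engine) as session:
    session.add_all([
        Employee(name="Asha", team="payments"),
        Employee(name="Ravi", team="payments"),
    ])
    session.commit()

    payments_team = session.scalars(
        select(Employee).where(Employee.team == "payments")
    ).all()
    for emp in payments_team:
        print(emp.id, emp.name)
```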
Key Requirements
Strong experience in Java programming and development.
Proficiency in the Spring Boot framework.
Hands-on experience with Hibernate ORM.
Advanced knowledge of SQL and database management.
Expertise in designing and implementing Microservices architecture.
Proven experience in lead roles or mentoring junior developers.
Ability to design and implement scalable backend services.
Knowledge of RESTful API development and integration.
Familiarity with Agile development methodologies and workflows.
Excellent problem-solving and analytical skills.
Salary: Negotiable or Not Mentioned
India, Chennai
8 days ago
theclosinggap.net
576 Views
The Technical Project Manager role at The Closing Gap is a unique position designed for individuals who possess deep technical expertise and strong delivery leadership within the insurance domain. The role requires a balanced 50/50 split between hands-on coding and high-level project management, specifically focusing on building impactful solutions for P&C, Life, and Health insurance sectors. You will be responsible for leading complex tech projects, ensuring high health scores for technical services, and maintaining superior code quality across all delivery phases.
In this full-time role based in Chennai, India, you will navigate the entire project lifecycle including technical analysis, design, building, and final delivery. Your work will involve significant migration and transformation projects where you will manage client-facing stakeholders and own the end-to-end delivery process. By driving sprint progress and adhering to technical roadmaps, you will ensure that technical expertise meets leadership to build insurance solutions that truly matter.
Key Requirements
10 to 15 years of professional experience in technical delivery and project management.
Proven hands-on experience with Java and Spring Boot framework for backend development.
Strong expertise in Microservices architecture and implementation.
Deep domain knowledge in Insurance sectors including P&C, Life, or Health.
Ability to balance 50% active coding with 50% project management duties.
Extensive experience in leading migration and transformation projects.
Excellent client-facing stakeholder management and communication skills.
Demonstrated ownership of end-to-end project delivery and sprint progress.
Proficiency in technical design, code quality analysis, and service health monitoring.
Strong leadership skills to guide technical teams while staying hands-on with the codebase.
Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
careernet.in
358 Views
My client within the Pharmaceutical sector is looking to expand its technology hub located in Bangalore. We are seeking high-impact Senior AWS Data Engineers who are ready to build scalable data platforms and implement cutting-edge solutions. This role is crucial for managing the infrastructure that supports data-driven decision-making in the pharmaceutical industry and ensuring that large-scale data assets are accessible and reliable.
The successful candidate will work extensively with AWS Glue, Lambda, and Databricks. You will be responsible for data modelling and processing using Python, PySpark, and SQL. This is a 100% work-from-office position in Bangalore, requiring candidates who are currently serving their notice period or can join within 30 days. Your expertise will directly contribute to the innovation of data architectures in a fast-paced environment.
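A small sketch of the serverless side of this stack: a hypothetical AWS Lambda handler that reacts to an S3 object-created event and reads the new file with boto3. The bucket wiring and downstream processing are placeholders; heavier transformations would typically be handed off to Glue or Databricks jobs.

```python
import json
import urllib.parse

import boto3

s3 = boto3.client("s3")


def lambda_handler(event, context):
    """Handle S3 ObjectCreated notifications (event wiring is assumed)."""
    processed = 0
    for record in event.get("Records", []):
        bucket = record["s3"]["bucket"]["name"]
        key = urllib.parse.unquote_plus(record["s3"]["object"]["key"])

        obj = s3.get_object(Bucket=bucket, Key=key)
        body = obj["Body"].read()

        # Placeholder: in a real pipeline this might validate the file,
        # register metadata, or trigger a Glue/Databricks job.
        print(f"received {key} from {bucket} ({len(body)} bytes)")
        processed += 1

    return {"statusCode": 200, "body": json.dumps({"processed": processed})}
```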
Key Requirements
6–12 years of professional experience in data engineering
Expertise in AWS Glue and AWS Lambda for serverless computing
Proficiency in Databricks for unified analytics and data processing
Strong programming skills in Python for data manipulation
Advanced knowledge of PySpark for big data processing tasks
Hands-on experience with SQL for complex database queries
Proven track record in Data Modelling and architectural design
Experience in the pharmaceutical or life sciences sector
Ability to build and maintain scalable data platforms
Strong analytical and problem-solving skills in a cloud environment
Salary: Negotiable or Not Mentioned
India, Bengaluru
8 days ago
huemot.com
418 Views
We are seeking a highly experienced Data Engineering Lead to spearhead a critical engagement within our Capital Markets practice. Based in Bengaluru, this role involves supporting a prominent Private Equity firm headquartered in New York. The successful candidate will oversee the development and maintenance of high-impact data pipelines and lakehouse architectures using cutting-edge technologies. You will work closely with stakeholders to translate business requirements into technical specifications, ensuring high data quality and system reliability across the enterprise.
You will be responsible for leading an offshore team of 5 to 7 engineers, ensuring the delivery of production-grade data solutions through mentorship and technical oversight. This position requires deep expertise in Azure Databricks and PySpark, along with a solid understanding of data governance through Unity Catalog. Candidates must possess a strong background in U.S. Capital Markets or Private Equity to effectively meet the complex data needs of our clients. Successful applicants will demonstrate a history of architectural excellence and the ability to navigate complex financial data landscapes.
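To illustrate the lakehouse pattern referenced above, the sketch below shows a PySpark incremental upsert into a Delta table addressed by Unity Catalog's three-level catalog.schema.table naming. It assumes a Databricks runtime where the delta Python package is available, and all table and column names are hypothetical.

```python
from delta.tables import DeltaTable  # available on Databricks runtimes
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical Unity Catalog names: <catalog>.<schema>.<table>.
SOURCE_TABLE = "finance_raw.positions.daily_positions"
TARGET_TABLE = "finance_curated.positions.positions_current"

updates = spark.read.table(SOURCE_TABLE)

# Upsert the latest positions into the curated Delta table.
target = DeltaTable.forName(spark, TARGET_TABLE)
(
    target.alias("t")
    .merge(updates.alias("s"), "t.position_id = s.position_id")
    .whenMatchedUpdateAll()
    .whenNotMatchedInsertAll()
    .execute()
)
```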
Key Requirements
15+ years of enterprise data engineering experience
Databricks Certified Data Engineer (mandatory certification)
5+ years of hands-on experience specifically on Azure Databricks
5+ years of hands-on PySpark experience with production-grade pipelines
Strong knowledge of Unity Catalog and data governance frameworks
Proven experience leading offshore teams of 5–7 engineers
Domain experience in U.S. Capital Markets, Private Equity, or Investment Management
Expertise in lakehouse architecture and modern data stack design
Advanced proficiency in SQL for complex data transformations
Strong understanding of CI/CD practices for automated data pipelines
Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
careernet.in
358 Views
We are looking for an experienced AI/ML Developer to join our growing technology hub in Bangalore. This role is tailored for professionals with a strong background in artificial intelligence and machine learning, particularly within the pharmaceutical sector. You will be at the forefront of building cutting-edge AI solutions and integrating advanced generative models into our data ecosystems.
The role requires a high degree of expertise in Python, TensorFlow, and Generative AI frameworks. Mandatory skills include FastAPI, with a strong preference for candidates who have experience in Agentic AI and modern Large Language Models such as OpenAI or Claude. This position is a 100% office-based role in Bangalore, and we are specifically looking for candidates who are on their notice period and can join within 30 days. You will be instrumental in deploying scalable AI models that drive technological advancement for our global clients.
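A minimal FastAPI sketch of the kind of service layer this role describes: one endpoint that validates a prompt with Pydantic and returns a generated reply. The model call is deliberately stubbed, since the actual OpenAI/Claude or TensorFlow integration would depend on the provider SDK and credentials in use; the service name and route are illustrative.

```python
from fastapi import FastAPI
from pydantic import BaseModel

app = FastAPI(title="genai-service")  # illustrative service name


class GenerateRequest(BaseModel):
    prompt: str
    max_tokens: int = 256


class GenerateResponse(BaseModel):
    reply: str


def run_model(prompt: str, max_tokens: int) -> str:
    # Placeholder for the real generative-model call (e.g. an OpenAI or
    # Claude SDK client, or an in-house TensorFlow model).
    return f"[stubbed reply to: {prompt[:40]}]"


@app.post("/generate", response_model=GenerateResponse)
def generate(request: GenerateRequest) -> GenerateResponse:
    reply = run_model(request.prompt, request.max_tokens)
    return GenerateResponse(reply=reply)

# Run locally with:  uvicorn app:app --reload   (assuming this file is app.py)
```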
Key Requirements
5–10 years of overall experience in software development
Minimum 4 years of dedicated experience in AI/ML projects
Strong proficiency in Python for machine learning workflows
Hands-on experience with TensorFlow or similar deep learning frameworks
Proven expertise in Generative AI technologies and applications
Proficiency in building and deploying APIs using FastAPI
Working knowledge of AWS cloud services for AI infrastructure
Experience with Agentic AI or AI agents development
Familiarity with OpenAI or Claude LLM integrations
Ability to optimize machine learning models for production performance