Salary: Negotiable or Not Mentioned
India, Bangalore
10 days ago
fxconsulting.in
774 Views
We are seeking a highly skilled Technical Lead for Data Engineering to join our dynamic team in Bangalore. This role is centered on building and scaling high-performance data systems that support our product-driven initiatives. As a lead, you will be at the forefront of designing scalable ETL pipelines and leveraging technologies such as Spark, Hadoop, and Kafka for large-scale data processing. Your expertise will ensure that our data infrastructure is robust, efficient, and capable of handling complex data workloads.
In addition to your technical responsibilities, you will provide leadership to the engineering team and work collaboratively with Data Scientists to optimize data models and ensure top-tier data quality and security. You will be expected to monitor and troubleshoot data pipelines while maintaining high standards for data governance. The ideal candidate brings 6 to 9 years of experience, a strong background in Python or Scala, and a deep understanding of cloud platforms like AWS, Azure, or GCP. This is a fantastic opportunity for a professional looking to lead engineering excellence in a fast-paced environment.
Key Requirements
6 to 9 years of professional experience in Data Engineering.
Proven expertise in Spark and other Big Data technologies.
Proficiency in coding with Python, Scala, or Java.
Extensive experience in developing and optimizing ETL pipelines.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong knowledge of Hadoop and Kafka for large-scale data processing.
Demonstrated experience in team management and leadership roles.
Ability to design and optimize complex data models.
Understanding of data quality, governance, and security principles.
Exceptional problem-solving skills and ability to work in fast-paced environments.
Salary: Negotiable or Not Mentioned
India, Thiruvananthapuram
27 days ago
ampcustech.com
1398 Views
As a Senior Backend Java Developer, you will be responsible for driving complex SaaS integration projects and architecting scalable microservices-based solutions. You will collaborate extensively with customers, partners, and product teams to define integration landscapes and execute onboarding strategies. This role is pivotal in delivering seamless integrations and ensuring high-quality deployments that meet the rigorous standards of modern cloud-native environments.
You will also focus on optimizing project delivery timelines by enhancing processes and documentation. Your technical expertise will be applied to supporting cloud infrastructure and troubleshooting critical system issues within client environments. This position offers a unique opportunity to work on large-scale transformation projects, leveraging distributed systems, Kafka, and modern cloud frameworks in a highly collaborative and fast-paced setting.
Key Requirements
5–10 years of professional application and data integration experience
Proven experience in at least one end-to-end SaaS implementation project
Strong expertise in microservices architecture and integration design principles
Advanced knowledge of REST, gRPC, message queues, and API-based integrations
Hands-on experience with cloud platforms including Azure, AWS, or GCP
Direct experience with Kafka Connect Framework and various connectors (HTTP, REST, JMS)
Deep understanding of clustering and high availability configurations
Strong knowledge of network protocols and security within integration stacks
Extensive experience working in an Agile development environment
Excellent communication, analytical, and troubleshooting skills
Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
costaffglobal.in
358 Views
We are seeking an experienced Java Full Stack Developer to join our dynamic engineering team in Bangalore. In this role, you will be responsible for designing, developing, and maintaining scalable microservices-based applications using modern technologies. You will build robust backend systems using Java and Spring Boot while also developing responsive front-end interfaces using React or Angular to provide a seamless user experience.
The ideal candidate will have 6 to 12 years of experience and a deep understanding of Apache Kafka for event-driven architecture and Kubernetes for container orchestration. You will collaborate with cross-functional teams in an agile environment to ensure high-quality code through rigorous testing and peer reviews. Joining our team offers the opportunity to work on cutting-edge technologies and high-impact projects with global reach, within a collaborative and growth-focused environment.
Key Requirements
Minimum of 6 to 12 years of professional experience in software development
Expert-level proficiency in Java and the Spring Boot framework
Demonstrated hands-on experience with Microservices Architecture
Deep expertise in Apache Kafka or other high-throughput messaging systems
Experience containerizing applications using Docker and orchestrating with Kubernetes
Proficiency in frontend technologies including React, Angular, or modern JavaScript
Strong understanding of RESTful API design and implementation
Familiarity with CI/CD pipelines and DevOps best practices
Experience working with cloud infrastructure such as AWS, Microsoft Azure, or GCP
Proven ability to collaborate effectively within an agile development team
Salary: Negotiable or Not Mentioned
India, Coimbatore
15 days ago
infolexus.com
1779 Views
Infolexus is currently recruiting on behalf of a prominent client in the Information Technology sector. This role is designed for a DevOps Engineer who is eager to contribute to a dynamic team environment and work with cutting-edge cloud technologies. The successful candidate will be responsible for building, maintaining, and optimizing scalable systems, ensuring high availability and performance across various platforms. This position offers an excellent opportunity for both freshers and experienced professionals to grow their careers within a forward-thinking organization.
As a DevOps Engineer based in Coimbatore, you will work closely with development and operations teams to streamline deployment processes and enhance infrastructure. You will utilize your knowledge of Linux systems, cloud environments like AWS or Azure, and automation tools to drive efficiency. The role involves managing CI/CD pipelines, writing scripts to automate repetitive tasks, and troubleshooting complex technical issues. Join a team that values innovation and technical excellence while pushing the boundaries of modern IT infrastructure.
Key Requirements
Basic knowledge of Linux/Unix systems
Familiarity with cloud platforms (AWS, Azure, or GCP)
Understanding of CI/CD tools (Jenkins, GitHub Actions)
Knowledge of scripting (Python, Bash, or similar)
0–3 years of experience in a relevant technical role
Ability to build and maintain scalable systems
Eagerness to work with modern cloud technologies
Strong analytical and problem-solving skills
Excellent communication and collaboration abilities
Proactive approach to learning new DevOps tools and methodologies
Salary: Negotiable or Not Mentioned
India, Chennai
16 days ago
epam.com
1062 Views
EPAM Systems is looking for talented and passionate Java Developers to join our dynamic team in Chennai. This role involves working on high-impact projects, collaborating with innovative professionals across the globe, and utilizing cutting-edge technologies to solve complex business problems. We value individuals who are eager to grow their careers and contribute to a fast-paced, collaborative environment.
The successful candidate will be responsible for designing, developing, and maintaining scalable applications using Core Java, Spring, and Microservices. You will also have the opportunity to work with modern front-end frameworks like React JS or Angular and deploy solutions on major cloud platforms including AWS, Azure, and GCP. If you have over six years of experience and are ready for a new challenge, we encourage you to join our growing team.
Key Requirements
Minimum of 6 years of experience in Java development.
Strong proficiency in Core Java programming and principles.
Hands-on experience with Spring and Spring Boot frameworks.
Proven expertise in using Hibernate for ORM.
Solid understanding and implementation experience of RESTful APIs.
Working knowledge of Microservices architecture.
Proficiency in front-end technologies like React JS or Angular.
Experience with cloud service providers such as AWS, Azure, or GCP.
Ability to join within a notice period ranging from 0 to 90 days.
Strong problem-solving skills and ability to work in a collaborative team environment.
Salary: Negotiable or Not Mentioned
India, Chennai
16 days ago
logfixscm.com
1143 Views
We are seeking a highly skilled and experienced Cloud Engineer to join our client's team on a contract basis. The ideal candidate will have substantial hands-on experience across various cloud platforms, with a focus on building and maintaining scalable, high-performance systems. This role is designed for a professional who is passionate about infrastructure management and delivering reliable, innovative solutions in a dynamic, fast-paced technological environment.
As a Cloud Engineer, you will be responsible for managing cloud infrastructure, ensuring system reliability, and optimizing performance. You will work closely with cross-functional teams to implement best practices in cloud computing and automation, managing complex infrastructure and ensuring that all cloud-based services are secure and scalable. The position requires a notice period of 15 days or less (immediate joiners welcome), making it a great opportunity for those ready to transition quickly into a significant technical role in Chennai.
Key Requirements
7–8 years of professional experience in cloud engineering or related roles.
Proficiency in cloud platforms such as AWS, Azure, or GCP.
Strong experience with infrastructure management and automation tools.
Knowledge of CI/CD pipelines and DevOps methodologies.
Expertise in designing scalable and high-performance system architectures.
Proficiency in scripting languages like Python or Bash for automation purposes.
Hands-on experience with containerization technologies such as Docker and Kubernetes.
Ability to manage and monitor cloud security and compliance protocols effectively.
Proven troubleshooting and analytical problem-solving skills in cloud environments.
Excellent communication and collaborative skills for working in cross-functional teams.
Salary: Negotiable or Not Mentioned
India, Bengaluru
8 days ago
huemot.com
591 Views
We are seeking a highly experienced Data Engineering Lead to spearhead a critical engagement within our Capital Markets practice. Based in Bengaluru, this role involves supporting a prominent Private Equity firm headquartered in New York. The successful candidate will oversee the development and maintenance of high-impact data pipelines and lakehouse architectures using cutting-edge technologies. You will work closely with stakeholders to translate business requirements into technical specifications, ensuring high data quality and system reliability across the enterprise.
You will be responsible for leading an offshore team of 5 to 7 engineers, ensuring the delivery of production-grade data solutions through mentorship and technical oversight. This position requires deep expertise in Azure Databricks and PySpark, along with a solid understanding of data governance through Unity Catalog. Candidates must possess a strong background in U.S. Capital Markets or Private Equity to effectively meet the complex data needs of our clients. Successful applicants will demonstrate a history of architectural excellence and the ability to navigate complex financial data landscapes.
Key Requirements
15+ years of enterprise data engineering experience
Databricks Certified Data Engineer (mandatory certification)
5+ years of hands-on experience specifically on Azure Databricks
5+ years of hands-on PySpark experience with production-grade pipelines
Strong knowledge of Unity Catalog and data governance frameworks
Proven experience leading offshore teams of 5–7 engineers
Domain experience in U.S. Capital Markets, Private Equity, or Investment Management
Expertise in lakehouse architecture and modern data stack design
Advanced proficiency in SQL for complex data transformations
Strong understanding of CI/CD practices for automated data pipelines
Salary: ~200,000 (Mentioned)
India, Bangalore
6 days ago
smartreferhub.in
506 Views
SmartReferHub is looking for a Lead Data Engineer with 6 to 8 years of experience to join their team in Bangalore. This high-impact hybrid role involves working on advanced Databricks and AWS Lakehouse architecture to lead large-scale data transformations. You will be responsible for driving enterprise-level analytics for global operations and accelerating the company's data strategy. The successful candidate will work on cutting-edge data technologies and lead impactful projects that shape the future of data engineering within the organization. Joining is expected within 30 days. The offered salary for this position ranges from ₹24 to ₹28 LPA, providing an excellent opportunity for career growth in the data analytics sector.
Key Requirements
Minimum 6 to 8 years of professional experience in data engineering roles.
Strong hands-on experience with Databricks and AWS Lakehouse architecture.
Proven track record of leading large-scale data transformations in an enterprise environment.
Ability to drive analytics solutions for global operations and cross-functional teams.
Deep expertise in Big Data technologies and cloud-based data ecosystems.
Strong proficiency in programming languages such as Python or Scala for data processing.
Expertise in writing complex SQL queries and optimizing data performance.
Solid understanding of ETL and ELT pipeline design and maintenance.
Experience with data modeling, data warehousing, and lakehouse concepts.
Strong leadership skills with the ability to manage technical projects and mentor team members.
Salary: Negotiable or Not Mentioned
India, Bengaluru
20 days ago
se-mentor.com
1093 Views
Join SE Mentor Solutions as a Big Data Engineer in Bengaluru. We are seeking a senior professional with a minimum of 8 years of experience to lead our big data initiatives. You will be at the forefront of designing and managing large-scale, complex data systems that power our organization's analytics and operations. This role demands a high level of expertise in modern data technologies and a passion for building robust, scalable architectures.
You will utilize your skills in SQL, Python, Databricks, and PySpark to develop and optimize data pipelines within the Azure Cloud ecosystem. As a senior engineer, you will also be responsible for mentoring junior staff and ensuring that all data solutions align with industry best practices. Your role is critical in transforming raw data into actionable insights while maintaining system performance and reliability. Please note that salary details were not included in the original job advertisement.
Key Requirements
Minimum of 8 years of experience in data engineering or big data roles.
Expert-level proficiency in SQL and Python.
Deep technical knowledge of Databricks and PySpark.
Significant experience working with Azure Cloud services.
Proven track record of designing and implementing large-scale data systems.
Knowledge of Hadoop ecosystem components and big data frameworks.
Ability to optimize system performance and manage large data volumes.
Experience in mentoring and leading technical teams.
Strong understanding of API integrations and data security.
Excellent problem-solving and strategic thinking capabilities.
Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
careernet.in
439 Views
My client in the Pharmaceutical sector is looking to expand its technology hub in Bangalore. We are seeking high-impact Senior AWS Data Engineers who are ready to build scalable data platforms and implement cutting-edge solutions. This role is crucial for managing the infrastructure that supports data-driven decision-making in the pharmaceutical industry and for ensuring that large-scale data assets remain accessible and reliable.
The successful candidate will work extensively with AWS Glue, Lambda, and Databricks, and will be responsible for data modelling and processing using Python, PySpark, and SQL. This is a 100% work-from-office position in Bangalore; candidates must be able to join within 30 days, whether immediately or while serving out their notice period. Your expertise will directly contribute to the innovation of data architectures in a fast-paced environment.
Key Requirements
6–12 years of professional experience in data engineering
Expertise in AWS Glue and AWS Lambda for serverless computing
Proficiency in Databricks for unified analytics and data processing
Strong programming skills in Python for data manipulation
Advanced knowledge of PySpark for big data processing tasks
Hands-on experience with SQL for complex database queries
Proven track record in Data Modelling and architectural design
Experience in the pharmaceutical or life sciences sector
Ability to build and maintain scalable data platforms
Strong analytical and problem-solving skills in a cloud environment