~200,000 Mentioned
India, Bangalore
6 days ago
smartreferhub.in
428 Views
SmartReferHub is looking for a Lead Data Engineer with 6 to 8 years of experience to join their team in Bangalore. This high-impact hybrid role involves working on advanced Databricks and AWS Lakehouse architecture to lead large-scale data transformations. You will be responsible for driving enterprise-level analytics for global operations and accelerating the company's data strategy. The successful candidate will work on cutting-edge data technologies and lead impactful projects that shape the future of data engineering within the organization. Joining is expected within 30 days. The offered salary for this position ranges from ₹24 to ₹28 LPA, providing an excellent opportunity for career growth in the data analytics sector.
Key Requirements
Minimum 6 to 8 years of professional experience in data engineering roles.
Strong hands-on experience with Databricks and AWS Lakehouse architecture.
Proven track record of leading large-scale data transformations in an enterprise environment.
Ability to drive analytics solutions for global operations and cross-functional teams.
Deep expertise in Big Data technologies and cloud-based data ecosystems.
Strong proficiency in programming languages such as Python or Scala for data processing.
Expertise in writing complex SQL queries and optimizing data performance.
Solid understanding of ETL and ELT pipeline design and maintenance.
Experience with data modeling, data warehousing, and lakehouse concepts.
Strong leadership skills with the ability to manage technical projects and mentor team members.
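The requirements above mention ETL and ELT pipeline design; as a rough illustration only (the table, fields, and data below are invented, not part of the posting), a minimal extract–transform–load step might look like this in plain Python:

```python
import sqlite3

# Illustrative source records, standing in for an extract step
# (a real pipeline would read from S3, a lakehouse table, etc.).
raw_orders = [
    {"order_id": "1", "amount": "250.00", "region": "emea"},
    {"order_id": "2", "amount": "bad-value", "region": "apac"},
    {"order_id": "3", "amount": "99.50", "region": "amer"},
]

def transform(records):
    """Drop malformed rows and normalise types and casing."""
    clean = []
    for r in records:
        try:
            amount = float(r["amount"])
        except ValueError:
            continue  # skip rows that fail validation
        clean.append((int(r["order_id"]), amount, r["region"].upper()))
    return clean

def load(rows, conn):
    """Load cleaned rows into the target table."""
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders "
        "(order_id INTEGER PRIMARY KEY, amount REAL, region TEXT)"
    )
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
load(transform(raw_orders), conn)
total = conn.execute("SELECT COUNT(*), SUM(amount) FROM orders").fetchone()
print(total)  # the malformed row is dropped before load
```

In practice roles like this use Databricks/PySpark rather than sqlite3; the sketch only shows the shape of the validate-then-load pattern.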
Negotiable or Not Mentioned
India, Bangalore
5 days ago
careernet.in
358 Views
My client in the pharmaceutical sector is looking to expand its technology hub in Bangalore. We are seeking high-impact Senior AWS Data Engineers who are ready to build scalable data platforms and implement cutting-edge solutions. This role is crucial for managing the infrastructure that supports data-driven decision-making in the pharmaceutical industry and ensuring that large-scale data assets are accessible and reliable. The successful candidate will work extensively with AWS Glue, Lambda, and Databricks. You will be responsible for data modelling and processing using Python, PySpark, and SQL. This is a 100% work-from-office position in Bangalore, open to candidates who are currently serving their notice period or can otherwise join within 30 days. Your expertise will directly contribute to the innovation of data architectures in a fast-paced environment.
Key Requirements
6–12 years of professional experience in data engineering
Expertise in AWS Glue and AWS Lambda for serverless computing
Proficiency in Databricks for unified analytics and data processing
Strong programming skills in Python for data manipulation
Advanced knowledge of PySpark for big data processing tasks
Hands-on experience with SQL for complex database queries
Proven track record in Data Modelling and architectural design
Experience in the pharmaceutical or life sciences sector
Ability to build and maintain scalable data platforms
Strong analytical and problem-solving skills in a cloud environment
Negotiable or Not Mentioned
India, Bengaluru
8 days ago
huemot.com
418 Views
We are seeking a highly experienced Data Engineering Lead to spearhead a critical engagement within our Capital Markets practice. Based in Bengaluru, this role involves supporting a prominent Private Equity firm headquartered in New York. The successful candidate will oversee the development and maintenance of high-impact data pipelines and lakehouse architectures using cutting-edge technologies. You will work closely with stakeholders to translate business requirements into technical specifications, ensuring high data quality and system reliability across the enterprise.
You will be responsible for leading an offshore team of 5 to 7 engineers, ensuring the delivery of production-grade data solutions through mentorship and technical oversight. This position requires deep expertise in Azure Databricks and PySpark, along with a solid understanding of data governance through Unity Catalog. Candidates must possess a strong background in U.S. Capital Markets or Private Equity to effectively meet the complex data needs of our clients. Successful applicants will demonstrate a history of architectural excellence and the ability to navigate complex financial data landscapes.
Key Requirements
15+ years of enterprise data engineering experience
Databricks Certified Data Engineer (mandatory certification)
5+ years of hands-on experience specifically on Azure Databricks
5+ years of hands-on PySpark experience with production-grade pipelines
Strong knowledge of Unity Catalog and data governance frameworks
Proven experience leading offshore teams of 5–7 engineers
Domain experience in U.S. Capital Markets, Private Equity, or Investment Management
Expertise in lakehouse architecture and modern data stack design
Advanced proficiency in SQL for complex data transformations
Strong understanding of CI/CD practices for automated data pipelines
Negotiable or Not Mentioned
India, Bengaluru
20 days ago
se-mentor.com
1093 Views
Join SE Mentor Solutions as a Big Data Engineer in Bengaluru. We are seeking a senior professional with a minimum of 8 years of experience to lead our big data initiatives. You will be at the forefront of designing and managing large-scale, complex data systems that power our organization's analytics and operations. This role demands a high level of expertise in modern data technologies and a passion for building robust, scalable architectures.
You will utilize your skills in SQL, Python, Databricks, and PySpark to develop and optimize data pipelines within the Azure Cloud ecosystem. As a senior engineer, you will also be responsible for mentoring junior staff and ensuring that all data solutions align with industry best practices. Your role is critical in transforming raw data into actionable insights while maintaining system performance and reliability. Please note that salary details were not included in the original job advertisement.
Key Requirements
Minimum of 8 years of experience in data engineering or big data roles.
Expert-level proficiency in SQL and Python.
Deep technical knowledge of Databricks and PySpark.
Significant experience working with Azure Cloud services.
Proven track record of designing and implementing large-scale data systems.
Knowledge of Hadoop ecosystem components and big data frameworks.
Ability to optimize system performance and manage large data volumes.
Experience in mentoring and leading technical teams.
Strong understanding of API integrations and data security.
Excellent problem-solving and strategic thinking capabilities.
Negotiable or Not Mentioned
India, Bangalore
10 days ago
fxconsulting.in
774 Views
We are seeking a highly skilled Technical Lead for Data Engineering to join our dynamic team in Bangalore. This role is centered on building and scaling high-performance data systems that support our product-driven initiatives. As a lead, you will be at the forefront of designing scalable ETL pipelines and leveraging technologies such as Spark, Hadoop, and Kafka for large-scale data processing. Your expertise will ensure that our data infrastructure is robust, efficient, and capable of handling complex data workloads.
In addition to your technical responsibilities, you will provide leadership to the engineering team and work collaboratively with Data Scientists to optimize data models and ensure top-tier data quality and security. You will be expected to monitor and troubleshoot data pipelines while maintaining high standards for data governance. The ideal candidate brings 6 to 9 years of experience, a strong background in Python or Scala, and a deep understanding of cloud platforms like AWS, Azure, or GCP. This is a fantastic opportunity for a professional looking to lead engineering excellence in a fast-paced environment.
Key Requirements
6 to 9 years of professional experience in Data Engineering.
Proven expertise in Spark and other Big Data technologies.
Proficiency in coding with Python, Scala, or Java.
Extensive experience in developing and optimizing ETL pipelines.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong knowledge of Hadoop and Kafka for large-scale data processing.
Demonstrated experience in team handling and leadership roles.
Ability to design and optimize complex data models.
Understanding of data quality, governance, and security principles.
Exceptional problem-solving skills and ability to work in fast-paced environments.
Negotiable or Not Mentioned
India, Bengaluru
9 days ago
scienstechnologies.com
650 Views
Sciens Technologies is seeking a dedicated Database Support Engineer with specialized expertise in AWS database services to join our team in Bengaluru. This role is pivotal in managing and supporting a variety of AWS databases, including RDS, Aurora, DynamoDB, DocumentDB, and ElastiCache. As part of a global "follow-the-sun" model, you will be responsible for ensuring the high availability, reliability, and performance of mission-critical systems. This involves active participation in incident management, root cause analysis (RCA), and providing 24/7 production support to maintain seamless operations for our global clients.
Beyond routine maintenance, the ideal candidate will drive performance optimization through query tuning and strategic indexing. You will also leverage automation using Python or Bash to streamline database operations and enhance system efficiency. Collaboration is a key component of this role, as you will work closely with global teams to maintain comprehensive documentation and runbooks. Monitoring system health using tools like CloudWatch, Prometheus, and Grafana will be part of your daily activities to ensure proactive issue resolution and disaster recovery preparedness. Candidates with a background in Kubernetes and Terraform are highly encouraged to apply.
Key Requirements
3–5 years of professional Database Administration experience.
Strong hands-on experience specifically with AWS RDS and Aurora services.
Profound knowledge of backup/recovery, High Availability (HA), and Disaster Recovery (DR) strategies.
Proficiency in scripting languages such as Python or Bash for operational automation.
Hands-on experience with monitoring and alerting tools like CloudWatch, Prometheus, or Grafana.
Proven ability to handle incident management and root cause analysis (RCA) in a production environment.
Expertise in SQL query tuning, indexing strategies, and database performance optimization.
Familiarity with AWS Database Migration Service (DMS) and general migration strategies.
Knowledge of Kubernetes (EKS) and managing containerized database environments.
Understanding of Infrastructure as Code (IaC) principles using Terraform or CloudFormation.
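The query-tuning and indexing expertise this listing asks for can be illustrated with a small, self-contained sketch (using sqlite3 purely for demonstration; the role itself concerns AWS RDS/Aurora, and the table and index names here are invented):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(1000)],
)

query = "SELECT COUNT(*) FROM events WHERE user_id = ?"

# Before indexing: the planner falls back to a full table scan.
before = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

conn.execute("CREATE INDEX idx_events_user ON events(user_id)")

# After indexing: the same query is satisfied via the index.
after = conn.execute("EXPLAIN QUERY PLAN " + query, (42,)).fetchall()

print(before[-1][-1])  # e.g. a SCAN over the whole table
print(after[-1][-1])   # e.g. a SEARCH using idx_events_user
```

The same scan-versus-index-seek reasoning carries over to `EXPLAIN`/`EXPLAIN ANALYZE` on RDS and Aurora engines, though the plan output format differs.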
Negotiable or Not Mentioned
India, Coimbatore
15 days ago
infolexus.com
1730 Views
Infolexus is currently recruiting on behalf of a prominent client in the Information Technology sector. This role is designed for a DevOps Engineer who is eager to contribute to a dynamic team environment and work with cutting-edge cloud technologies. The successful candidate will be responsible for building, maintaining, and optimizing scalable systems, ensuring high availability and performance across various platforms. This position offers an excellent opportunity for both freshers and experienced professionals to grow their careers within a forward-thinking organization.
As a DevOps Engineer based in Coimbatore, you will work closely with development and operations teams to streamline deployment processes and enhance infrastructure. You will utilize your knowledge of Linux systems, cloud environments like AWS or Azure, and automation tools to drive efficiency. The role involves managing CI/CD pipelines, writing scripts to automate repetitive tasks, and troubleshooting complex technical issues. Join a team that values innovation and technical excellence while pushing the boundaries of modern IT infrastructure.
Key Requirements
Basic knowledge of Linux/Unix systems
Familiarity with cloud platforms (AWS, Azure, or GCP)
Understanding of CI/CD tools (Jenkins, GitHub Actions)
Knowledge of scripting (Python, Bash, or similar)
0–3 years of experience in a relevant technical role
Ability to build and maintain scalable systems
Eagerness to work with modern cloud technologies
Strong analytical and problem-solving skills
Excellent communication and collaboration abilities
Proactive approach to learning new DevOps tools and methodologies
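The "scripting to automate repetitive tasks" requirement above might look like the following minimal sketch (the log lines and format are invented for illustration; a real script would read from a file or log aggregator):

```python
import re
from collections import Counter

# Sample log lines standing in for output collected from a service.
log_lines = [
    "2024-05-01 10:00:01 INFO  service started",
    "2024-05-01 10:00:05 ERROR connection refused by upstream",
    "2024-05-01 10:00:09 WARN  retrying in 5s",
    "2024-05-01 10:00:14 ERROR connection refused by upstream",
]

# Assumes "date time LEVEL message" lines; adjust the pattern per format.
LEVEL_RE = re.compile(r"^\S+ \S+ (\w+)")

def summarise(lines):
    """Count log lines per severity level."""
    levels = Counter()
    for line in lines:
        m = LEVEL_RE.match(line)
        if m:
            levels[m.group(1)] += 1
    return levels

summary = summarise(log_lines)
print(dict(summary))  # → {'INFO': 1, 'ERROR': 2, 'WARN': 1}
```

A script like this would typically run as a CI/CD step or cron job and alert when the error count crosses a threshold.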
Negotiable or Not Mentioned
India, Bangalore
21 days ago
quesscorp.com
1429 Views
Quess Corp is seeking a dedicated Technical Application Support specialist to join their team in Bangalore. This hybrid role requires three days of in-office work per week, blending remote flexibility with essential on-site collaboration. Candidates must be currently based in Bangalore and available to join immediately, as the recruitment process involves both virtual and face-to-face interview components. The position is ideal for professionals with 5 to 7 years of experience who can navigate complex technical landscapes with ease.
The primary responsibility involves supporting various applications by leveraging strong cloud platform knowledge, particularly in AWS, and ensuring high system availability. You will work with web servers, APIs, and SQL-based database systems to resolve technical issues and optimize performance. Effective incident management using tools like Jira and ServiceNow is a core part of the daily workflow, requiring a structured approach to problem-solving and excellent communication skills to coordinate with stakeholders and team members.
Key Requirements
Strong understanding of cloud platforms, specifically Amazon Web Services (AWS)
Familiarity with web servers and RESTful APIs
Proficiency in database systems and SQL for data management
Experience with ticketing systems such as Jira or ServiceNow
Knowledge of incident management processes and industry best practices
Basic proficiency with Linux/Unix commands and system troubleshooting
Excellent communication and interpersonal skills
Strong analytical and problem-solving abilities
Ability to work in a fast-paced environment and handle multiple priorities
Minimum of 5 to 7 years of experience in technical application support
Must be a resident of Bangalore, India
Availability to join the organization immediately
Negotiable or Not Mentioned
India, Bangalore
3 days ago
yeswayconsultancy.in
281 Views
Yesway Consultancy is seeking a high-impact Business Analyst specializing in Analytics and Business Intelligence to join the team in Bangalore. This role is designed for a professional with excellent communication skills and a strong ability to manage stakeholders, confidently engaging with senior leadership to drive analytics initiatives from conception to completion. The ideal candidate will be a proactive problem-solver who can lead business discussions, identify key analytics use cases, and translate business needs into technical problem statements and actionable solutions.
The successful candidate will be responsible for the end-to-end requirements of analytics projects, including the definition of enterprise KPIs, metrics, and performance frameworks. Technical proficiency in BI platforms like Power BI or Tableau, combined with deep exposure to the SAP Analytics ecosystem (BW, HANA, S/4), is essential. You will also be tasked with creating detailed documentation such as functional specifications, KPI logic, and data mappings, while ensuring data integrity through rigorous dashboard and data validation techniques. Immediate joiners or those with a notice period of up to 15 days are preferred.
Key Requirements
Proven experience as a Business Analyst specifically within Analytics, BI, or Data environments.
Demonstrated experience leading complex, cross-functional analytics initiatives.
Strong domain expertise in at least one area: Supply Chain, Finance, Sales, Manufacturing, or Operations.
Extensive stakeholder management experience, particularly with senior business leaders.
Hands-on experience with BI and Analytics platforms such as Power BI or Tableau.
Deep exposure to the SAP Analytics ecosystem, including BW, HANA, and S/4.
Strong understanding of Data Modelling, KPI frameworks, and semantic layers.
Proficiency in dashboard validation and data validation techniques.
Working knowledge of SQL for data querying and analysis.
Excellent communication skills and the ability to lead high-level business discussions.
Ability to create detailed functional and analytical documentation.
Negotiable or Not Mentioned
India, Bangalore
18 days ago
bridgetownresearch.org
1207 Views
Bridgetown Research is currently looking for talented SDE2 and SDE3 engineers to join our small and focused team in Bangalore. We are dedicated to building AI-native products that address real-world challenges through innovative technology. Our team operates with a low-ego, high-ownership culture where every member is expected to think like an owner and take full responsibility for the outcomes of their work, not just the tasks assigned. This hybrid role is based out of our modern office in Koramangala, offering a collaborative environment for engineers who thrive on solving complex problems and working on cutting-edge software solutions.
In this role, you will be responsible for developing robust backend systems using Node.js, TypeScript, and Python. You will play a critical role in building and scaling distributed systems on AWS, managing various database technologies such as Postgres, DynamoDB, and Elasticsearch, and ensuring the overall security and reliability of our platforms. Candidates should be comfortable working with infrastructure tools like Docker, Kubernetes, and Terraform to maintain a scalable and efficient environment. Compensation for this position is competitive and is offered at or above current market standards for experienced engineers in the region.
Key Requirements
Strong backend experience with 5-8+ years in the industry.
Proficiency in Node.js and TypeScript for building scalable applications.
Excellent programming skills in Python.
Familiarity with AI systems and AI-native product development.
Extensive experience building and scaling systems on AWS cloud infrastructure.
Comfortable working with relational and NoSQL databases like Postgres and DynamoDB.
Experience with search and indexing tools such as Elasticsearch.
Hands-on experience with containerization using Docker and orchestration with Kubernetes.
Knowledge of Infrastructure as Code using Terraform.
Ability to handle data at scale and implement performance improvements.
Solid understanding of software security best practices and building reliable systems.
A low-ego mindset with a high degree of ownership and responsibility.