Salary: ~₹140,000/month (mentioned)
India, Remote
6 days ago
talentquell.com
191 Views
We are seeking a highly skilled MLOps Engineer to spearhead our cross-cloud machine learning model migration initiatives, specifically moving from GCP to Azure Databricks. The successful candidate will build and optimize production-grade MLOps workflows and CI/CD pipelines while implementing MLflow for model tracking and lifecycle management. You will develop scalable pipelines using Databricks and PySpark, ensuring seamless data movement and high model reliability. The budget for this position is ₹1.4 LPM (lakh per month).
In this role, you will also focus on performance and cost optimization of machine learning infrastructures. You will work closely with Data Science teams to enable efficient model deployment and monitoring across cloud environments. This position operates on a UK time shift (8 AM – 5 PM) and offers a remote working arrangement within India. Candidates should have a strong foundation in data engineering and pipeline orchestration to succeed in this dynamic environment.
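Since MLflow tracking sits at the heart of this role, here is a minimal sketch of what experiment tracking looks like in practice: it logs hyperparameters, a metric, and the trained model from a single run. The experiment name, model choice, and values are hypothetical, chosen only to illustrate the API.

```python
# Minimal MLflow tracking sketch (experiment name, model choice, and
# hyperparameters are hypothetical, not taken from this posting).
import mlflow
import mlflow.sklearn
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

mlflow.set_experiment("churn-model-migration")  # hypothetical experiment name

with mlflow.start_run():
    params = {"n_estimators": 100, "max_depth": 8}
    model = RandomForestClassifier(**params, random_state=42).fit(X_train, y_train)

    mlflow.log_params(params)  # record hyperparameters for this run
    accuracy = accuracy_score(y_test, model.predict(X_test))
    mlflow.log_metric("accuracy", accuracy)  # record the evaluation metric
    mlflow.sklearn.log_model(model, "model")  # version the trained artifact
```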
Key Requirements
Minimum 6–8 years of professional experience in MLOps or a related technical role.
Strong expertise in Databricks and PySpark for large-scale data processing.
Hands-on experience with MLflow for tracking experiments and managing model lifecycles.
Proven proficiency in CI/CD practices and workflows using tools like GitHub Actions.
Extensive experience working with cloud platforms, specifically Microsoft Azure and GCP.
Demonstrated ability to perform cross-cloud data movement and model migration tasks.
In-depth knowledge of model deployment strategies and continuous monitoring systems.
Strong background in data engineering and the orchestration of complex pipelines.
Excellent communication and collaboration skills for working with Data Science teams.
Availability to work according to the UK shift timings from 8 AM to 5 PM.
Salary: ~₹200,000/month (mentioned)
India, Bangalore
6 days ago
smartreferhub.in
191 Views
SmartReferHub is looking for a Lead Data Engineer with 6 to 8 years of experience to join its team in Bangalore. This high-impact hybrid role involves working on advanced Databricks and AWS Lakehouse architecture to lead large-scale data transformations. You will drive enterprise-level analytics for global operations and accelerate the company's data strategy. The successful candidate will work with cutting-edge data technologies and lead impactful projects that shape the future of data engineering within the organization. Joining is expected within 30 days. The offered salary ranges from ₹24 to ₹28 LPA, an excellent opportunity for career growth in the data analytics sector.
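As a rough sketch of the lakehouse work described above, the snippet below promotes raw JSON landed on S3 into a partitioned Delta table, the bronze-to-silver step typical of Databricks-on-AWS architectures. The bucket, columns, and table names are hypothetical.

```python
# Hypothetical bronze-to-silver Delta step on an AWS lakehouse (bucket,
# columns, and table names are assumptions, not from this posting).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("orders-bronze-to-silver").getOrCreate()

# Read raw landed JSON (bronze layer) from S3.
raw = spark.read.json("s3://example-bucket/landing/orders/")  # hypothetical path

# Basic cleansing: deduplicate, normalise types, stamp processing time.
silver = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_ts", F.to_timestamp("order_ts"))
       .withColumn("order_date", F.to_date("order_ts"))
       .withColumn("_ingested_at", F.current_timestamp())
)

# Persist as a partitioned Delta table (silver layer).
(
    silver.write.format("delta")
          .mode("overwrite")
          .partitionBy("order_date")
          .saveAsTable("analytics.orders_silver")  # hypothetical table
)
```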
Key Requirements
Minimum 6 to 8 years of professional experience in data engineering roles.
Strong hands-on experience with Databricks and AWS Lakehouse architecture.
Proven track record of leading large-scale data transformations in an enterprise environment.
Ability to drive analytics solutions for global operations and cross-functional teams.
Deep expertise in Big Data technologies and cloud-based data ecosystems.
Strong proficiency in programming languages such as Python or Scala for data processing.
Expertise in writing complex SQL queries and optimizing data performance.
Solid understanding of ETL and ELT pipeline design and maintenance.
Experience with data modeling, data warehousing, and lakehouse concepts.
Strong leadership skills with the ability to manage technical projects and mentor team members.
Salary: Negotiable or not mentioned
India, Bengaluru
8 days ago
huemot.com
295 Views
We are seeking a highly experienced Data Engineering Lead to spearhead a critical engagement within our Capital Markets practice. Based in Bengaluru, this role involves supporting a prominent Private Equity firm headquartered in New York. The successful candidate will oversee the development and maintenance of high-impact data pipelines and lakehouse architectures using cutting-edge technologies. You will work closely with stakeholders to translate business requirements into technical specifications, ensuring high data quality and system reliability across the enterprise.
You will be responsible for leading an offshore team of 5 to 7 engineers, ensuring the delivery of production-grade data solutions through mentorship and technical oversight. This position requires deep expertise in Azure Databricks and PySpark, along with a solid understanding of data governance through Unity Catalog. Candidates must possess a strong background in U.S. Capital Markets or Private Equity to effectively meet the complex data needs of our clients. Successful applicants will demonstrate a history of architectural excellence and the ability to navigate complex financial data landscapes.
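To give a concrete feel for the Unity Catalog governance mentioned above, the sketch below creates a governed table in the three-level namespace and grants read access to a group. The catalog, schema, table, and group names are hypothetical; an actual client setup would differ.

```python
# Hypothetical Unity Catalog governance snippet for a Databricks notebook
# (catalog, schema, table, and group names are assumptions).
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()  # provided automatically on Databricks

# Unity Catalog uses a three-level namespace: catalog.schema.table.
spark.sql("CREATE CATALOG IF NOT EXISTS capital_markets")
spark.sql("CREATE SCHEMA IF NOT EXISTS capital_markets.positions")

spark.sql("""
    CREATE TABLE IF NOT EXISTS capital_markets.positions.daily_holdings (
        fund_id     STRING,
        security_id STRING,
        quantity    DECIMAL(18, 4),
        as_of_date  DATE
    )
""")

# Grant read access to an analyst group; access is enforced centrally
# by the metastore rather than per-cluster.
spark.sql(
    "GRANT SELECT ON TABLE capital_markets.positions.daily_holdings TO `analysts`"
)  # hypothetical principal
```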
Key Requirements
15+ years of enterprise data engineering experience
Databricks Certified Data Engineer (mandatory certification)
5+ years of hands-on experience specifically on Azure Databricks
5+ years of hands-on PySpark experience with production-grade pipelines
Strong knowledge of Unity Catalog and data governance frameworks
Proven experience leading offshore teams of 5–7 engineers
Domain experience in U.S. Capital Markets, Private Equity, or Investment Management
Expertise in lakehouse architecture and modern data stack design
Advanced proficiency in SQL for complex data transformations
Strong understanding of CI/CD practices for automated data pipelines
Salary: Negotiable or not mentioned
India
7 days ago
jyotistructures.in
287 Views
Jyoti Structures is seeking experienced Section Engineers to support its growing portfolio of transmission and infrastructure projects throughout India. This role is crucial for managing specific sections of large-scale projects, ensuring that execution meets the company’s high standards for quality and safety. You will be a key member of the on-ground execution team, contributing to the development of critical national infrastructure and managing various project sites across India.
As a Section Engineer, you will focus on operational excellence and engineering depth. The position offers the chance to work in an execution-driven culture where safety and integrity are prioritized. You will be involved in coordinating activities at project sites across India, ensuring that transmission line installations are completed efficiently and according to technical specifications. Join a team dedicated to lighting up the future through engineering excellence and nation-building initiatives.
Key Requirements
Experience in section-wise management of transmission line projects.
Proficiency in site execution and managing diverse workforces on-ground.
Strong technical knowledge of infrastructure construction and engineering.
Ability to ensure strict adherence to safety and quality protocols.
Previous experience working on large-scale EPC transmission projects.
Effective communication skills for seamless team coordination and leadership.
Proven problem-solving skills for addressing on-site engineering challenges.
Knowledge of project scheduling, resource allocation, and site logistics.
Degree in Civil or Electrical Engineering from a recognized institution.
Willingness to relocate or travel to various project sites across India.
Salary: Negotiable or not mentioned
India, Bangalore
10 days ago
fxconsulting.in
571 Views
We are seeking a highly skilled Technical Lead for Data Engineering to join our dynamic team in Bangalore. This role is centered on building and scaling high-performance data systems that support our product-driven initiatives. As a lead, you will be at the forefront of designing scalable ETL pipelines and leveraging technologies such as Spark, Hadoop, and Kafka for large-scale data processing. Your expertise will ensure that our data infrastructure is robust, efficient, and capable of handling complex data workloads.
In addition to your technical responsibilities, you will provide leadership to the engineering team and work collaboratively with Data Scientists to optimize data models and ensure top-tier data quality and security. You will be expected to monitor and troubleshoot data pipelines while maintaining high standards for data governance. The ideal candidate brings 6 to 9 years of experience, a strong background in Python or Scala, and a deep understanding of cloud platforms like AWS, Azure, or GCP. This is a fantastic opportunity for a professional looking to lead engineering excellence in a fast-paced environment.
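As an illustration of the Spark-plus-Kafka processing this role centres on, the sketch below consumes a Kafka topic with Structured Streaming and computes windowed aggregates. The broker address, topic name, and event schema are hypothetical.

```python
# Hypothetical Spark Structured Streaming job reading from Kafka (broker,
# topic, schema, and paths are assumptions; needs the spark-sql-kafka package).
from pyspark.sql import SparkSession, functions as F
from pyspark.sql.types import (DoubleType, StringType, StructField,
                               StructType, TimestampType)

spark = SparkSession.builder.appName("events-stream").getOrCreate()

event_schema = StructType([
    StructField("event_id", StringType()),
    StructField("amount", DoubleType()),
    StructField("event_ts", TimestampType()),
])

# Subscribe to the raw events topic.
stream = (
    spark.readStream.format("kafka")
         .option("kafka.bootstrap.servers", "broker:9092")  # hypothetical broker
         .option("subscribe", "events")                     # hypothetical topic
         .load()
)

# Kafka delivers raw bytes; parse the JSON payload into typed columns.
parsed = stream.select(
    F.from_json(F.col("value").cast("string"), event_schema).alias("e")
).select("e.*")

# Windowed aggregation with a watermark to bound late-arriving data.
totals = (
    parsed.withWatermark("event_ts", "10 minutes")
          .groupBy(F.window("event_ts", "5 minutes"))
          .agg(F.sum("amount").alias("total_amount"))
)

query = (
    totals.writeStream.outputMode("update")
          .format("console")  # swap for a real sink in production
          .option("checkpointLocation", "/tmp/checkpoints/events")
          .start()
)
query.awaitTermination()
```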
Key Requirements
6 to 9 years of professional experience in Data Engineering.
Proven expertise in Spark and other Big Data technologies.
Proficiency in coding with Python, Scala, or Java.
Extensive experience in developing and optimizing ETL pipelines.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong knowledge of Hadoop and Kafka for large-scale data processing.
Demonstrated experience managing teams and serving in leadership roles.
Ability to design and optimize complex data models.
Understanding of data quality, governance, and security principles.
Exceptional problem-solving skills and ability to work in fast-paced environments.
Salary: ~₹291,666/month (mentioned)
India, Mumbai
11 days ago
nextjobhunt.com
701 Views
We are seeking a highly experienced Data Engineer for a high-impact role with a leading financial institution (a semi-government bank) located in BKC, Mumbai. In this position, you will design and build scalable batch and real-time data pipelines and develop data models, marts, and feature stores for advanced analytics and reporting. You will also play a critical role in implementing data quality, lineage, and governance frameworks to ensure data security and compliance with regulatory standards. The role is a full-time contract for an initial three-year term, which is extendable. The salary offered for this position is ₹35–40 LPA.
As a core member of the data team, you will support data science and analytics units with optimized datasets that impact credit, risk, and banking operations. You will work on enterprise-level data platforms built with modern technologies such as Spark and Python across major cloud providers. This is a unique opportunity for a professional with over 8 years of experience to work in a high-stakes environment within the financial services sector, contributing to robust data infrastructures that drive business decisions.
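For a flavour of the pipeline orchestration such a role involves, here is a minimal Airflow DAG wiring a daily extract step to a load step. The DAG id, schedule, and task bodies are hypothetical placeholders.

```python
# Minimal Airflow DAG sketch for a daily batch pipeline (DAG id, schedule,
# and task bodies are hypothetical placeholders).
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator

def extract_transactions(**context):
    # Placeholder: pull the previous day's transactions from the source system.
    print("extracting transactions for", context["ds"])

def load_to_mart(**context):
    # Placeholder: upsert the cleaned rows into the reporting mart.
    print("loading mart for", context["ds"])

with DAG(
    dag_id="daily_transactions_mart",  # hypothetical
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    extract = PythonOperator(task_id="extract", python_callable=extract_transactions)
    load = PythonOperator(task_id="load", python_callable=load_to_mart)
    extract >> load  # run extract before load each day
```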
Key Requirements
Minimum of 8 years of professional experience in Data Engineering or related roles.
Strong proficiency in SQL and Data Modelling including OLTP/OLAP and Star/Snowflake schemas.
Hands-on experience with programming languages such as Python, Scala, or Java combined with Spark.
Proven experience working with ETL/ELT tools like Airflow or Azure Data Factory.
Significant exposure to major Cloud Platforms including AWS, Azure, or GCP.
In-depth knowledge of NoSQL and Graph Databases for varied data storage needs.
Solid understanding of Data Governance, Master Data Management (MDM), and Data Quality standards.
Previous professional experience in Banking, NBFC, Financial Services, or Fintech industries.
Deep understanding of regulatory and compliance data requirements specific to the financial sector.
Relevant certifications in Cloud Computing or Data Engineering are highly preferred.
Ability to build and maintain scalable real-time and batch data pipelines.
Strong communication skills to collaborate with data science and operations teams.
Salary: Negotiable or not mentioned
India, Navi Mumbai
9 days ago
walkingtree.tech
398 Views
We are seeking a highly skilled Automation Specialist with expertise in Java and Selenium to join our dynamic QA Engineering team in Navi Mumbai. The ideal candidate will be responsible for developing and maintaining robust automated test scripts using Selenium WebDriver, TestNG, and Cucumber frameworks. You will play a crucial role in designing scalable automation frameworks using Maven and managing version control through Git/GitHub to ensure high-quality software delivery and stability across releases.
In this role, you will collaborate closely with development teams to analyze test results, identify defects, and resolve technical issues. You will also be tasked with managing regression test suites and integrating them into CI/CD pipelines using tools like Jenkins to streamline the software development lifecycle. We value professionals who write clean, reusable code and possess excellent problem-solving skills. Candidates with additional experience in API testing using Postman or RestAssured and those holding ISTQB certifications are highly encouraged to apply.
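The role calls for Java with TestNG, but the shape of a WebDriver test is language-agnostic; the sketch below shows the same pattern in Python, with a hypothetical URL and locators, purely for illustration.

```python
# WebDriver login test sketched in Python for illustration; the role itself
# requires Java/TestNG, and the URL and locators here are hypothetical.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.support.ui import WebDriverWait

driver = webdriver.Chrome()
try:
    driver.get("https://example.com/login")  # hypothetical page

    driver.find_element(By.ID, "username").send_keys("qa_user")
    driver.find_element(By.ID, "password").send_keys("secret")
    driver.find_element(By.CSS_SELECTOR, "button[type='submit']").click()

    # Explicit wait: assert the dashboard heading appears after login.
    heading = WebDriverWait(driver, 10).until(
        EC.visibility_of_element_located((By.TAG_NAME, "h1"))
    )
    assert "Dashboard" in heading.text
finally:
    driver.quit()
```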
Key Requirements
Bachelor’s Degree in Engineering or MCA.
3 to 4 years of professional experience in Automation Testing.
Advanced proficiency in Java programming language.
Extensive hands-on experience with Selenium WebDriver for web applications.
Proven ability to work with TestNG and Cucumber (BDD) frameworks.
Experience in building and managing automation projects using Maven.
Solid understanding of version control systems, specifically Git and GitHub.
Experience with CI/CD tools, particularly Jenkins, for continuous integration.
Strong skills in analyzing test results and performing root cause analysis for defects.
Excellent verbal and written communication skills for team collaboration.
Salary: ~₹291,666/month (mentioned)
India, Mumbai
11 days ago
nextjobhunt.com
607 Views
We are currently seeking a highly skilled Data Engineer to join a leading semi-government bank in Mumbai. This is a high-impact role within the financial sector, where you will design and build scalable batch and real-time data pipelines. You will work on enterprise-level data platforms that significantly impact credit, risk, and operations. The position is a full-time contract for an initial period of three years, with the possibility of extension based on performance and project requirements. The CTC for this role is ₹35–40 LPA.
In this role, you will develop data models, marts, and feature stores for advanced analytics and reporting. You will also implement data quality, lineage, and governance frameworks to ensure data integrity across the organization. Security and compliance are paramount, so you must ensure all data platforms align with regulatory standards. You will provide critical support to data science and analytics teams by supplying optimized datasets. Candidates should have a strong background in SQL, Spark, and Python/Java/Scala, alongside experience with cloud platforms such as AWS, Azure, or GCP.
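As a small illustration of the data quality gating this role emphasizes, the sketch below validates a staged batch before publishing it downstream. Table names, columns, and rules are hypothetical; production teams would more likely reach for a framework such as Great Expectations or Delta table constraints.

```python
# Hypothetical pre-publication data quality gate in PySpark (table names,
# columns, and rules are assumptions, not from this posting).
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("dq-gate").getOrCreate()

df = spark.table("staging.loan_accounts")  # hypothetical staging table
total = df.count()

# Rule 1: the primary key must be present and unique.
null_keys = df.filter(F.col("account_id").isNull()).count()
dupe_keys = total - df.select("account_id").distinct().count()

# Rule 2: balances must be non-negative.
bad_balances = df.filter(F.col("balance") < 0).count()

failures = {"null_keys": null_keys, "dupe_keys": dupe_keys,
            "bad_balances": bad_balances}
if any(count > 0 for count in failures.values()):
    # Fail loudly rather than publishing bad data downstream.
    raise ValueError(f"Data quality gate failed: {failures}")

df.write.mode("overwrite").saveAsTable("curated.loan_accounts")
```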
Key Requirements
Minimum of 8 years of experience in Data Engineering or a similar role.
Strong proficiency in SQL and Data Modelling using OLTP/OLAP and Star/Snowflake schemas.
Hands-on experience with programming languages like Python, Scala, or Java.
Expertise in Big Data processing frameworks, specifically Spark.
Experience in building and managing ETL/ELT pipelines using tools like Airflow or Data Factory.
Extensive exposure to Cloud Platforms including AWS, Azure, or GCP.
Knowledge of NoSQL and Graph Databases.
Demonstrated understanding of Data Governance, Master Data Management (MDM), and Data Quality.
Prior experience working within the Banking, NBFC, or Financial Services industry.
Understanding of regulatory and compliance data requirements for the financial sector.
Salary: Negotiable or not mentioned
India, Remote
14 days ago
sapphiresoftwaresolutions.com
1297 Views
We are seeking a skilled Data Engineer to join a fast-growing team supporting major global brands like KFC, Pizza Hut, and Taco Bell. This is a fully remote role based in India, with a shift schedule of 12 PM to 9 PM IST. The initial contract duration is three months, with a high probability of extension based on performance and project needs. You will be responsible for building and optimizing data pipelines using Informatica IICS and Snowflake, focusing on scalable data integration frameworks within an AWS cloud environment.

The ideal candidate should have at least 2 years of experience in data engineering, with strong technical skills in Python scripting, SQL, and event-driven architectures. You will work on impactful global projects, supporting advanced analytics and AI/ML initiatives while collaborating with dedicated DevOps teams. Exposure to Airflow and streaming pipelines such as Kafka or AWS Streaming is highly desirable. This is an excellent opportunity to work on modern data platforms and drive data-driven decisions for world-class organizations.
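For a taste of the Snowflake side of this stack, the sketch below lands a semi-structured JSON event into a VARIANT column using the Python connector. The account, credentials, and object names are hypothetical, and in this role most data movement would run through Informatica IICS rather than hand-written scripts.

```python
# Hypothetical load of a semi-structured JSON event into Snowflake via the
# Python connector (account, credentials, and object names are assumptions).
import json

import snowflake.connector

conn = snowflake.connector.connect(
    account="example-account",  # hypothetical
    user="etl_user",
    password="***",
    warehouse="LOAD_WH",
    database="ANALYTICS",
    schema="RAW",
)

event = {"order_id": "A-1001", "store": "KFC-042",
         "items": [{"sku": "ZNG", "qty": 2}]}

with conn.cursor() as cur:
    cur.execute("""
        CREATE TABLE IF NOT EXISTS raw_events (
            payload   VARIANT,
            loaded_at TIMESTAMP_NTZ DEFAULT CURRENT_TIMESTAMP()
        )
    """)
    # PARSE_JSON converts the bound string into Snowflake's VARIANT type;
    # INSERT ... SELECT is the standard way to apply it during insert.
    cur.execute(
        "INSERT INTO raw_events (payload) SELECT PARSE_JSON(%s)",
        (json.dumps(event),),
    )
conn.close()
```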
Key Requirements
2+ years of professional Data Engineering experience.
Proficiency in Informatica Cloud (IICS) as the primary ETL tool.
Hands-on experience with Snowflake as a target data platform.
Strong expertise in AWS (Amazon Web Services) cloud environment.
Advanced knowledge of SQL for complex data queries and manipulation.
Solid programming skills in Python for scripting and automation.
Experience handling both structured and semi-structured data formats.
Familiarity with REST APIs and AWS Lambda functions.
Ability to work the 12 PM to 9 PM IST shift.
Capacity to collaborate effectively with cross-functional teams and DevOps.
Salary: Negotiable or not mentioned
India, Pune
10 days ago
leopardequipments.com
421 Views
We are looking for an Engineer to support our industrial operations at Leopard Equipments and Engineers Pvt Ltd. This position is based in Ranjangaon, Pune, and is perfect for early-career engineers looking to apply their mechanical or electrical knowledge in a real-world industrial setting. The role involves working on high-impact projects and collaborating with senior engineers to ensure operational efficiency.
The ideal candidate should be enthusiastic about learning and capable of handling technical challenges across various industrial environments. You will gain hands-on experience with industrial machinery and contribute to the success of our engineering team. We offer a supportive environment that encourages career progression and skill enhancement through mentorship and diverse project exposure.
Key Requirements
Diploma or BE/B.Tech in Mechanical or Electrical Engineering.
0–2 years of relevant experience in an engineering role.
Solid understanding of core engineering principles and applications.
Ability to handle technical documentation and detailed reporting.
Proficiency in diagnosing and resolving industrial technical issues.
Experience or interest in industrial equipment maintenance.
Team-oriented mindset with strong problem-solving skills.
Adaptability to work in various industrial and site environments.
Strong attention to detail and strict adherence to safety protocols.
Commitment to continuous learning and professional improvement.