Salary: Negotiable or Not Mentioned
India
20 days ago
se-mentor.com
1402 Views
We are looking for an experienced Databricks Engineer to enhance our data engineering capabilities at SE Mentor Solutions. In this role, you will be leveraging Azure services, PySpark, and Databricks to build high-performance data frameworks. Your contributions will help drive our data-driven decision-making processes by ensuring the reliability and scalability of our cloud-based data systems. This position offers the flexibility to work from either Cochin or Bengaluru.
Candidates should possess at least 5 years of experience in the data engineering field with a focus on Azure technologies. You will be responsible for designing and implementing efficient data processing logic and managing complex data architectures within the Azure cloud environment. As part of our team, you will participate in the full lifecycle of data projects, from requirements gathering to deployment and monitoring. Salary information was not provided in the original job post.
Key Requirements
Over 5 years of experience in data engineering or related technical roles.
Strong expertise in SQL and PySpark for data processing tasks.
Hands-on experience with Azure Databricks and Azure Data Factory.
Familiarity with Azure Cloud infrastructure and related services.
Experience in building and optimizing large-scale data architectures.
Ability to design and implement automated data pipelines.
Understanding of data warehousing concepts and technologies.
Strong troubleshooting skills for cloud-based data environments.
Effective teamwork and communication skills.
Commitment to maintaining high standards of data security and integrity.
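For candidates gauging the "strong expertise in SQL" bar above, a small illustrative sketch — picking the latest record per key with a window function, a task that comes up constantly in pipeline work. Table and column names here are hypothetical, not from the employer.

```python
# Illustrative only: selecting the most recent row per customer
# with ROW_NUMBER(), a common data-engineering SQL pattern.
# The orders table is a made-up example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer TEXT, amount REAL, updated_at TEXT);
    INSERT INTO orders VALUES
        (1, 'acme',   100.0, '2024-01-01'),
        (2, 'acme',   150.0, '2024-02-01'),
        (3, 'globex',  80.0, '2024-01-15');
""")

latest = conn.execute("""
    SELECT customer, amount
    FROM (
        SELECT *,
               ROW_NUMBER() OVER (
                   PARTITION BY customer
                   ORDER BY updated_at DESC
               ) AS rn
        FROM orders
    )
    WHERE rn = 1
    ORDER BY customer
""").fetchall()

print(latest)  # [('acme', 150.0), ('globex', 80.0)]
```

The same pattern translates directly to Spark SQL or a PySpark `Window` spec in the Databricks environment the posting describes.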
Salary: Negotiable or Not Mentioned
India
20 days ago
pyxidiatech.com
1237 Views
We are seeking a highly skilled Senior Data Analyst to join our dynamic team and play a pivotal role in designing, building, and scaling innovative data solutions across our products and client implementations. The successful candidate will be responsible for developing scalable data pipelines, optimizing ETL workflows, and ensuring the highest standards of data quality and reliability. You will work closely with cross-functional teams, including Product, Engineering, and Data Science, to drive data architecture decisions and deliver actionable insights that solve complex business challenges. This role offers the opportunity to provide technical guidance and mentorship to junior analysts while working on impactful, data-driven projects.

This position is available in multiple locations across India, specifically Mumbai, Bangalore, and Pune. The ideal candidate will have over four years of experience in data-focused roles and a deep understanding of the AWS ecosystem, including Redshift, Athena, and EMR. Experience with US Healthcare Data is considered a significant advantage. Candidates must be proficient in Python, PySpark, and SQL, and possess a strong grasp of data modeling and performance optimization. If you are passionate about big data and looking to make a significant impact in a collaborative environment, we encourage you to apply.
Key Requirements
Minimum of 4 years of professional experience in Data Analytics or Data Engineering.
Demonstrated expertise in SQL and both relational and NoSQL database management.
Hands-on proficiency with Python and PySpark for processing large-scale datasets.
Proven experience in building and optimizing ETL/ELT pipelines using Airflow or AWS Glue.
Strong conceptual understanding of data modeling and performance tuning.
Advanced technical knowledge of AWS data services including S3, EMR, Redshift, and Athena.
Ability to design and manage complex data architectures across structured and unstructured sources.
Competency in maintaining high standards for data quality, validation, and monitoring.
Strategic thinking skills to perform analysis and generate actionable business insights.
Strong collaborative skills to work effectively with Product, Engineering, and Client teams.
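The "data quality, validation, and monitoring" competency above usually reduces to simple, repeatable checks. A minimal sketch, with invented field names, of two such checks — a null-rate and a duplicate-key test:

```python
# A toy data-quality check over a batch of records.
# Record shape and field names are hypothetical illustrations.
rows = [
    {"id": 1, "region": "south", "revenue": 120.0},
    {"id": 2, "region": None,    "revenue": 75.5},
    {"id": 2, "region": "west",  "revenue": 75.5},  # duplicate id
]

def null_rate(records, field):
    """Fraction of records where `field` is missing or None."""
    return sum(r.get(field) is None for r in records) / len(records)

def duplicate_keys(records, key):
    """Keys appearing more than once -- a basic uniqueness check."""
    seen, dupes = set(), set()
    for r in records:
        k = r[key]
        (dupes if k in seen else seen).add(k)
    return dupes

print(round(null_rate(rows, "region"), 3))  # 0.333
print(duplicate_keys(rows, "id"))           # {2}
```

In practice these checks run inside the Airflow or Glue pipelines the requirements mention, failing the job or alerting when a threshold is breached.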
Salary: Negotiable or Not Mentioned
India
6 hours ago
emperentech.com
39 Views
Emperen Technologies is seeking elite Databricks talent to join our global team of experts. As an Official Databricks Partner, we specialize in helping enterprises scale their data transformation initiatives faster, smarter, and more cost-efficiently. We are looking for professionals who can hit the ground running on a contract or hourly basis to meet urgent delivery needs for our diverse portfolio of enterprise clients.
Candidates will be responsible for leveraging Azure Databricks, Spark, and PySpark to build robust data pipelines and architectures. The role centers on data migration, modernization, and the integration of AI/ML models into existing business analytics frameworks. You will work closely with Data Engineers and Architects to enable outcomes that drive business value. If you possess deep technical capability and a proven track record in data initiatives, we encourage you to apply.
Key Requirements
Proficiency in Azure Databricks and Apache Spark ecosystems.
Strong experience with PySpark for large-scale data processing.
Solid background in Data Engineering and Data Architecture principles.
Expertise in Data Migration and Modernization of legacy systems.
Ability to integrate AI/ML and Analytics into production data pipelines.
Available to work on a Contract and Hourly Basis for urgent delivery.
Strong communication skills for collaborating with CTOs and Heads of Data.
Experience with cloud infrastructure and security best practices.
Proven ability to deliver high-quality outcomes in fast-paced environments.
Knowledge of Spark optimization and performance tuning techniques.
Salary: Negotiable or Not Mentioned
India
16 days ago
nishtechnologies.com
936 Views
Nish Technologies is seeking a skilled Data Engineer to join a prestigious Big4 MNC client on a full-time basis. The ideal candidate will have between 5 and 8 years of professional experience in data engineering and will be responsible for designing and implementing efficient data solutions. The primary work locations for this role are Hyderabad and Bangalore. This position requires a strong technical background and the ability to work in a fast-paced, high-impact environment.
Candidates must be proficient in Python, SQL, and PySpark to handle complex data sets and pipelines. We are conducting a virtual weekend recruitment drive on Saturday, April 4th. This role is intended for immediate joiners or those with a notice period of up to 15 days. Interested professionals are encouraged to share their profiles for consideration in this expedited hiring process.
Key Requirements
5 to 8 years of relevant experience in Data Engineering.
Advanced proficiency in Python programming for data processing.
Expertise in writing complex SQL queries for database management.
In-depth knowledge of PySpark and its application in big data projects.
Ability to join immediately or within a maximum notice period of 15 days.
Experience working within a Big4 MNC or similar large-scale environment.
Strong analytical skills to solve complex data-related problems.
Familiarity with ETL processes and data pipeline orchestration.
Excellent communication skills for cross-functional team collaboration.
Availability to participate in the virtual weekend recruitment drive on April 4th.
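The "data pipeline orchestration" familiarity listed above boils down to running tasks in dependency order. A toy sketch using Python's standard-library topological sorter over a hypothetical DAG of ETL steps (step names are invented):

```python
# Illustrative only: orchestrators like Airflow schedule tasks by
# topologically sorting a DAG. graphlib (Python 3.9+) does the same
# in miniature. Mapping is node -> set of predecessor tasks.
from graphlib import TopologicalSorter

dag = {
    "load_warehouse": {"transform"},
    "transform": {"extract_orders", "extract_customers"},
    "extract_orders": set(),
    "extract_customers": set(),
}

order = list(TopologicalSorter(dag).static_order())
print(order)  # extracts first, transform next, load last
```

Real orchestrators add retries, scheduling, and parallelism on top, but the dependency model is the same.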
Salary: Negotiable or Not Mentioned
India, Remote
14 days ago
sapphiresoftwaresolutions.com
1302 Views
We are seeking a skilled Data Engineer to join a fast-growing team supporting major global brands like KFC, Pizza Hut, and Taco Bell. This is a fully remote role based in India, with a shift schedule of 12 PM to 9 PM IST. The initial contract duration is three months, with a high probability of extension based on performance and project needs. You will be responsible for building and optimizing data pipelines using Informatica IICS and Snowflake, focusing on scalable data integration frameworks within an AWS cloud environment.

The ideal candidate should have at least 2 years of experience in data engineering, with strong technical skills in Python scripting, SQL, and event-driven architectures. You will work on impactful global projects, supporting advanced analytics and AI/ML initiatives while collaborating with dedicated DevOps teams. Exposure to Airflow and streaming pipelines such as Kafka or AWS Streaming is highly desirable. This is an excellent opportunity to work on modern data platforms and drive data-driven decisions for world-class organizations.
Key Requirements
2+ years of professional Data Engineering experience.
Proficiency in Informatica Cloud (IICS) as the primary ETL tool.
Hands-on experience with Snowflake as a target data platform.
Strong expertise in AWS (Amazon Web Services) cloud environment.
Advanced knowledge of SQL for complex data queries and manipulation.
Solid programming skills in Python for scripting and automation.
Experience handling both structured and semi-structured data formats.
Familiarity with REST APIs and AWS Lambda functions.
Ability to work the 12 PM to 9 PM IST shift.
Capacity to collaborate effectively with cross-functional teams and DevOps.
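The "structured and semi-structured data formats" requirement above typically means turning nested JSON into tabular rows before loading a warehouse such as Snowflake. A minimal, hypothetical sketch (the event shape is invented):

```python
# Illustrative only: flattening a nested JSON event into one flat
# row per line item, repeating the order-level fields.
import json

event = json.loads("""
{
  "order_id": 42,
  "customer": {"name": "Ada", "country": "IN"},
  "items": [
    {"sku": "A1", "qty": 2},
    {"sku": "B7", "qty": 1}
  ]
}
""")

def flatten(order):
    """Yield one flat dict per line item."""
    for item in order["items"]:
        yield {
            "order_id": order["order_id"],
            "customer_name": order["customer"]["name"],
            "country": order["customer"]["country"],
            "sku": item["sku"],
            "qty": item["qty"],
        }

rows = list(flatten(event))
print(len(rows))       # 2
print(rows[0]["sku"])  # A1
```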
Salary: Negotiable or Not Mentioned
India, Remote
6 days ago
tekvo.io
552 Views
Tekvo is looking for a seasoned and highly motivated Azure Technical Lead to spearhead our cloud data engineering projects. In this critical role, you will be the driving force behind large-scale Azure analytics initiatives, overseeing the end-to-end development of high-impact data platforms. You will be responsible for defining the solution design, ensuring technical excellence across the delivery lifecycle, and providing strategic guidance to engineering teams. Your expertise will directly contribute to the creation of scalable, robust, and efficient data architectures that empower our clients to make data-driven decisions.
As an Azure Technical Lead, you must demonstrate mastery over core Azure data services such as Data Factory, Synapse, and Databricks. The role demands a blend of deep technical proficiency in SQL and Python along with the leadership skills required to mentor developers and manage complex stakeholder expectations. This remote position offers a unique opportunity for professionals based in India to work on cutting-edge cloud technologies within a collaborative environment. Successful candidates will be expected to maintain high standards of code quality and architectural integrity while driving innovation in the data engineering space.
Key Requirements
Possess 10-14 years of professional experience in data engineering and cloud platforms.
Demonstrate expert-level proficiency in designing and implementing Azure Data Factory pipelines.
Have hands-on experience with Azure Synapse Analytics for enterprise data warehousing.
Show strong technical expertise in using Azure Databricks for big data processing.
Maintain advanced knowledge of SQL for complex data manipulation and performance tuning.
Exhibit proficiency in Python programming for automating data workflows and engineering tasks.
Prove a track record of leading and delivering large-scale analytics initiatives on Azure.
Possess strong solution design skills with the ability to create scalable data architectures.
Demonstrate the ability to guide, mentor, and manage high-performing technical teams.
Experience in cloud security best practices and data governance frameworks is highly preferred.
Excellent communication skills to interact with stakeholders and translate business needs into technical solutions.
Salary: Negotiable or Not Mentioned
India, Remote
18 days ago
idtsolution.in
1769 Views
We are looking for a skilled and passionate Data Engineer to join our advanced analytics team. The ideal candidate will have hands-on experience in IICS, Snowflake, SQL, and Cloud platforms, with a strong foundation in building scalable data pipelines and modern data integration frameworks. You will play a key role in developing data-driven solutions to support AI/ML initiatives and enhance customer experience across global operations. The role requires working from 12:00 PM to 9:00 PM IST to align with team requirements.
Your core responsibilities will include designing ETL/ELT processes and managing cloud-based data solutions on platforms like AWS, Azure, or GCP. You will handle both structured and semi-structured data while implementing essential data quality and monitoring processes. Collaboration with cross-functional teams is vital to support business growth initiatives. The package for this role is up to ₹14 LPA, reflecting the expertise required to handle complex data architectures and DevOps practices within a modern data stack environment.
Key Requirements
2+ years of experience in Data Engineering.
Strong hands-on experience with Snowflake.
Proficiency in Cloud platforms (AWS, Azure, or GCP).
Expert knowledge of SQL and Python.
Extensive experience with Informatica Cloud (IICS) and ETL tools.
Proven ability to build scalable data pipelines and cloud-based data solutions.
Knowledge of serverless architectures and APIs.
Familiarity with DevOps practices including CI/CD and IaC.
Experience with streaming tools like Kafka or Spark Streaming.
Bachelor’s degree in a relevant field such as Computer Science or Engineering.
Salary: Negotiable or Not Mentioned
India
20 days ago
se-mentor.com
1045 Views
SE Mentor Solutions is seeking a skilled ETL Developer to join our technical team. This role is responsible for designing, developing, and maintaining robust ETL processes to support our data integration needs. You will work closely with data architects and analysts to ensure data quality and system efficiency across various platforms. Possible work locations for this role include Cochin and Bengaluru.
The ideal candidate will have over 4 years of experience with a strong background in Advanced SQL and various ETL tools. You will be expected to optimize database performance, troubleshoot complex data issues, and contribute to the continuous improvement of our data infrastructure. This is an excellent opportunity to work on scalable solutions in a high-growth environment focused on technological innovation and data excellence. No salary was mentioned in the original posting.
Key Requirements
Minimum of 4 years of professional experience in ETL development.
Proficiency in Advanced SQL for complex query writing and optimization.
Experience with data modeling and database design principles.
Proven ability to build and maintain scalable data pipelines.
Strong analytical and problem-solving skills for troubleshooting data issues.
Knowledge of performance tuning for ETL processes and database systems.
Familiarity with version control systems such as Git.
Excellent communication skills for collaborating with cross-functional teams.
Ability to document technical processes and data workflows clearly.
Experience working in an Agile development environment.
Salary: Negotiable or Not Mentioned
India
18 days ago
compugra.com
1105 Views
Compugra is urgently seeking a seasoned Java Full Stack Developer for a full-time engagement. This role is designed for a highly skilled professional with 7 to 10 years of experience who can hit the ground running in a fast-paced environment. The position offers the opportunity to work in dynamic technical hubs with possible work locations in Hyderabad and Bangalore.
The successful candidate will be responsible for end-to-end software development, involving both complex backend logic and responsive frontend design. You will work within a collaborative team to build scalable applications, maintain code quality, and implement modern software engineering practices. No specific salary was mentioned in the original posting.
Key Requirements
Minimum of 7 to 10 years of professional experience in full stack development.
In-depth knowledge of Java and the Spring Framework ecosystem.
Proficiency in frontend technologies such as React, Angular, or Vue.js.
Experience building and consuming RESTful web services.
Strong understanding of relational databases like MySQL, Oracle, or PostgreSQL.
Familiarity with Microservices architecture and containerization.
Proven experience with version control systems, specifically Git.
Ability to work effectively in an Agile/Scrum development environment.
Solid understanding of object-oriented programming principles and design patterns.
Excellent analytical, debugging, and problem-solving capabilities.
Salary: Negotiable or Not Mentioned
India
31 days ago
gspann.com
1642 Views
We are looking for an experienced GenAI Engineer to join the GSPANN team to work on cutting-edge generative technologies. This position is available in several key Indian locations including Gurugram, Hyderabad, and Pune. As a senior member of the team with over 6 years of experience, you will lead the development of innovative generative AI applications and frameworks that drive business value.

The role requires deep expertise in Python, SQL, and MLOps to manage the entire lifecycle of generative models. You will be responsible for designing and implementing AI solutions, working closely with data scientists and software engineers to create impactful products. This is an opportunity to be at the forefront of the AI revolution and contribute to high-visibility projects within a global organization.
Key Requirements
At least 6 years of relevant experience in software or data engineering.
Expert-level proficiency in Python programming.
Strong command of SQL for data manipulation and analysis.
Extensive knowledge of Generative AI concepts and frameworks.
Proven experience with MLOps for model deployment and monitoring.
Solid background in Data Science methodologies.
General expertise in Artificial Intelligence and Machine Learning.
Experience with Large Language Models (LLMs) and fine-tuning.
Understanding of neural network architectures and transformers.
Ability to collaborate with cross-functional teams on complex projects.
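As a hedged illustration of one fundamental behind the LLM and transformer requirements above: models turn logits into token probabilities with a temperature-scaled softmax, and the temperature setting controls how sharply sampling favours the top token. A minimal pure-Python sketch (the logit values are arbitrary):

```python
# Toy example of temperature-scaled softmax, a building block of
# LLM sampling. Not any specific framework's implementation.
import math

def softmax(logits, temperature=1.0):
    scaled = [x / temperature for x in logits]
    m = max(scaled)                       # subtract max for stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.1]
probs = softmax(logits, temperature=0.7)
print(round(sum(probs), 6))  # 1.0

# Lower temperature sharpens the distribution toward the top logit:
print(softmax(logits, 0.2)[0] > softmax(logits, 1.0)[0])  # True
```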
Salary: Negotiable or Not Mentioned
India
6 days ago
huquo.com
412 Views
Huquo is currently looking for talented professionals for a General Analytics role based across India. The position is categorized under Band C1/C2 and is specifically looking for candidates who can join immediately. The role involves working with data analytics frameworks and requires a strong grasp of SQL and foundational concepts of Generative AI to provide meaningful insights and data-driven solutions for the organization.
Candidates should have over 4 years of relevant experience in the analytics field. You will be expected to utilize your analytical expertise to handle various data sets and contribute to the company's strategic goals. This is an excellent opportunity for experienced professionals to advance their careers in a fast-paced and innovative environment where data analytics is at the forefront of business operations.
Key Requirements
Minimum of 4 years of professional experience in analytics.
Proficiency in SQL for data manipulation and querying.
Basic understanding and knowledge of Generative AI (Gen AI).
Strong expertise in General Data Analytics and methodologies.
Ability to meet the criteria for professional bands C1 or C2.
Must be an immediate joiner or have a very short notice period.
Strong analytical and problem-solving capabilities.
Ability to work in a fast-paced corporate environment across India.
Experience with data visualization tools and reporting techniques.
Excellent communication skills to present data findings to stakeholders.
Salary: Negotiable or Not Mentioned
India, Hyderabad
20 days ago
infosys.com
1755 Views
Join our innovative team at Infosys as a Java Developer based in Hyderabad. In this role, you will be responsible for developing and maintaining high-quality software solutions using Java versions 8, 11, or 17. You will work extensively with Spring Boot and Microservices architecture to build scalable enterprise-level applications that meet our clients' complex needs. Your daily tasks will include designing and implementing robust REST APIs, managing SQL databases, and leveraging cloud platforms, specifically AWS, to ensure efficient deployment and high performance.
At Infosys, we foster a collaborative and innovative work culture where you can grow alongside industry leaders. We are looking for candidates with 3 to 8 years of professional experience who possess strong problem-solving skills and a proactive attitude. Experience in the banking or financial domain is highly desirable, as is knowledge of containerization tools like Docker and Kubernetes, along with familiarity with modern CI/CD pipelines. This is an exciting opportunity to work on large-scale systems and contribute to cutting-edge technology projects within a global IT leader.
Key Requirements
3–8 years of professional experience in software development
Proficiency in Java versions 8, 11, or 17
Hands-on experience with Spring Boot and Microservices architecture
Strong understanding of RESTful API design and implementation
Solid knowledge of SQL and relational database management
Experience working with cloud platforms, preferably AWS
Proven problem-solving and analytical skills
Prior experience in the Banking or Financial services domain
Familiarity with containerization technologies like Docker and Kubernetes
Understanding of CI/CD pipelines and DevOps best practices
Salary: Negotiable or Not Mentioned
India
11 days ago
lancesoft.in
691 Views
Lancesoft is seeking a highly skilled and experienced C++ Developer with specific expertise in Pro*C to join our dynamic technical team. In this role, you will be responsible for contributing to critical development projects and providing high-level support for our ongoing software initiatives. You will work on complex systems, ensuring performance and stability through diligent root cause analysis and proactive debugging of existing codebases.
The ideal candidate will have over six years of professional experience and be capable of handling enhancements and OM support activities efficiently. This position is open across multiple locations in India, offering a pan-India scope for qualified candidates. We are specifically looking for immediate joiners or those with a notice period of up to 15 days to fill these vital positions within our organization. The role involves close collaboration with various stakeholders to deliver high-quality technical solutions.
Key Requirements
Minimum of 6 years of professional experience in C++ development.
Proven hands-on expertise with Oracle Pro*C for database interaction.
Strong ability to perform root cause analysis and complex debugging of software issues.
Experience in developing and implementing system enhancements and new features.
Ability to support and collaborate effectively on OM (Operations and Maintenance) support activities.
Proficiency in SQL and working with relational database management systems.
Familiarity with Unix or Linux operating systems and shell scripting.
Experience with version control systems such as Git or SVN.
Solid understanding of software development life cycle (SDLC) and Agile methodologies.
Excellent problem-solving skills and the ability to work in a fast-paced environment.
Salary: Negotiable or Not Mentioned
India
31 days ago
symphonihr.com
1638 Views
Symphoni HR is seeking a passionate and skilled Java Full Stack Developer to contribute to the development of enterprise-grade systems. In this role, you will be responsible for building robust backend systems using Java and Spring Boot while designing scalable Microservices architectures. On the frontend, you will create dynamic and responsive user interfaces using Angular 6+. This position offers a unique opportunity to work on high-performance applications and collaborate with cross-functional teams including Product, QA, and DevOps to ensure high code quality and system performance. Possible work locations for this position include Bangalore and Mumbai.

The ideal candidate should possess between 3 and 6 years of professional experience in software development and demonstrate a strong understanding of Data Structures and Object-Oriented Programming. Beyond the core technical stack, exposure to Docker, Kubernetes, Kafka, and cloud platforms like AWS or Azure is highly desirable. We are looking for engineers who care about writing clean, maintainable code and are ready to thrive in a fast-paced, collaborative environment. If you are a problem-solver dedicated to building impactful technology solutions, we encourage you to apply.
Key Requirements
Strong proficiency in Java programming language and its core ecosystem.
Hands-on experience in building backend applications using the Spring Boot framework.
Proven experience in designing and developing Microservices architecture.
Solid experience in building frontend applications using Angular (version 6 or higher).
Expertise in developing and integrating RESTful APIs for seamless communication.
Strong knowledge of relational databases and proficiency in writing SQL queries.
Deep understanding of Data Structures and Object-Oriented Programming principles.
Experience with containerization tools like Docker or Kubernetes is highly preferred.
Familiarity with messaging systems such as Apache Kafka is an added advantage.
Knowledge of cloud infrastructure and services like AWS or Microsoft Azure.
Salary: ~₹140,000 per month (Mentioned)
India, Remote
6 days ago
talentquell.com
546 Views
We are seeking a highly skilled MLOps Engineer to spearhead our cross-cloud machine learning model migration initiatives, specifically moving from GCP to Azure Databricks. The successful candidate will be responsible for building and optimizing production-grade MLOps workflows and CI/CD pipelines while implementing MLflow for meticulous model tracking and lifecycle management. You will develop scalable pipelines using Databricks and PySpark, ensuring seamless data movement and high model reliability. The budget for this position is ₹1.4 LPM.
In this role, you will also focus on performance and cost optimization of machine learning infrastructures. You will work closely with Data Science teams to enable efficient model deployment and monitoring across cloud environments. This position operates on a UK time shift (8 AM – 5 PM) and offers a remote working arrangement within India. Candidates should have a strong foundation in data engineering and pipeline orchestration to succeed in this dynamic environment.
Key Requirements
Minimum 6–8 years of professional experience in MLOps or a related technical role.
Strong expertise in Databricks and PySpark for large-scale data processing.
Hands-on experience with MLflow for tracking experiments and managing model lifecycles.
Proven proficiency in CI/CD practices and workflows using tools like GitHub Actions.
Extensive experience working with cloud platforms, specifically Microsoft Azure and GCP.
Demonstrated ability to perform cross-cloud data movement and model migration tasks.
In-depth knowledge of model deployment strategies and continuous monitoring systems.
Strong background in data engineering and the orchestration of complex pipelines.
Excellent communication and collaboration skills for working with Data Science teams.
Availability to work according to the UK shift timings from 8 AM to 5 PM.
Salary: Negotiable or Not Mentioned
India, Hyderabad
16 days ago
xautomations.com
1003 Views
xautomations is looking for a part-time Data Modeler to join our team in Hyderabad. This role is focused on designing and maintaining efficient data structures that support our real-time systems and data pipelines. You will work on creating models that optimize data storage and retrieval for high-performance applications, ensuring data integrity and consistency across various platforms.
The position offers flexibility as a part-time role while providing the opportunity to work on complex, real-world data challenges within a professional engineering environment. You will collaborate with our engineering team to ensure that our data architecture is scalable and aligned with evolving business requirements. This is an office-based role located in Hyderabad.
Key Requirements
Strong background in data modeling techniques and methodologies.
Experience with relational and non-relational database design.
High proficiency in SQL for data manipulation and querying.
Understanding of data pipeline architectures and ETL processes.
Ability to create both logical and physical data models.
Knowledge of data warehousing concepts and star/snowflake schemas.
Experience using professional data modeling software and tools.
Collaborative mindset for working with data engineers and scientists.
Strong attention to detail regarding data governance and quality.
Effective communication skills to explain data structures to stakeholders.
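The star-schema knowledge listed above can be sketched in a few lines: a fact table keyed to dimension tables, queried with a join. This is a toy schema for illustration, not any particular employer's model:

```python
# Minimal star schema: fact_sales joins to dim_product (and could
# join to dim_date the same way). Tables and data are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE dim_date    (date_id INTEGER PRIMARY KEY, month TEXT);
    CREATE TABLE fact_sales  (
        product_id INTEGER REFERENCES dim_product(product_id),
        date_id    INTEGER REFERENCES dim_date(date_id),
        amount     REAL
    );
    INSERT INTO dim_product VALUES (1, 'widget'), (2, 'gadget');
    INSERT INTO dim_date VALUES (10, '2024-01'), (11, '2024-02');
    INSERT INTO fact_sales VALUES (1, 10, 5.0), (1, 11, 7.0), (2, 10, 3.0);
""")

totals = conn.execute("""
    SELECT p.name, SUM(f.amount)
    FROM fact_sales f
    JOIN dim_product p USING (product_id)
    GROUP BY p.name
    ORDER BY p.name
""").fetchall()

print(totals)  # [('gadget', 3.0), ('widget', 12.0)]
```

A snowflake schema differs only in that the dimensions themselves are further normalized into sub-dimension tables.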
Salary: Negotiable or Not Mentioned
India, Hyderabad
18 days ago
interaslabs.com
1086 Views
Interas Labs is currently seeking a highly skilled and experienced Java Backend Developer to join our dynamic team in Hyderabad. This is a full-time role offered on an onsite or hybrid basis, specifically looking for immediate joiners who can start within 0 to 7 days. The ideal candidate will have over four years of experience and a deep passion for building scalable, high-quality backend systems that drive modern applications using the latest Java versions.

In this role, you will be responsible for developing robust microservices and RESTful APIs using Java 8, 11, or 17 and the Spring Boot framework. You will work closely with cross-functional teams in an Agile environment, leveraging cloud technologies such as AWS, Azure, or GCP. You will also manage database solutions across SQL and NoSQL platforms, ensuring data integrity and performance. Additional exposure to AI tools and frontend technologies like React or Angular is highly valued as we continue to innovate and expand our tech stack.
Key Requirements
At least 4 years of professional experience in backend development.
Strong expertise in Java versions 8, 11, or 17.
Proficiency in building microservices using the Spring Boot framework.
Extensive experience in designing and developing RESTful APIs.
Solid understanding of database management systems including SQL (PostgreSQL, MySQL) and NoSQL (MongoDB).
Hands-on experience with cloud platforms such as AWS, Azure, or Google Cloud Platform.
Practical knowledge of CI/CD pipelines and containerization tools like Docker and Kubernetes.
Proficiency in version control systems such as Git, GitHub, GitLab, or Bitbucket.
Deep understanding of security protocols and authentication methods including OAuth2, JWT, and SAML.
Experience working within Agile methodologies like Scrum or Kanban.
Salary: Negotiable or Not Mentioned
India, Remote
8 days ago
gmail.com
723 Views
CutoffDekho.com is seeking SQL Developer and Database Interns for a focused 1-month remote internship program. This role is ideal for those interested in the technical aspects of data management, offering the chance to work on live database projects with guidance from experienced mentors. The internship is 100% free and provides a solid foundation for a career in database administration or backend development.
Interns who successfully complete the program will be awarded a completion certificate and may be eligible for full-time job roles with a monthly compensation of ₹18K to ₹25K. This work-from-home position is part of a global initiative, welcoming applicants from regions including Afghanistan, the UAE, and multiple states across India.
Key Requirements
Foundational knowledge of SQL queries.
Understanding of relational database management systems.
Interest in data structures and database design.
Strong logical and analytical problem-solving skills.
Ability to commit to a 1-month live project schedule.
Willingness to work under the guidance of technical mentors.
Basic understanding of data normalization techniques.
Self-motivated to learn complex technical concepts.
Ability to troubleshoot simple database issues.
Capacity to work efficiently in a remote setting.
0 Negotiable or Not Mentioned
India
8 days ago
gmail.com
653 Views
SearchMate is partnering with an elite client to recruit a Senior SDET specialized in ETL environments and Backend Automation. This is a high-impact role aimed at selection-ready professionals capable of managing complex data pipelines and developing robust automation frameworks. The candidate will work under a hybrid model with office locations available in Chennai, Hyderabad, and Pune. The selected candidate will join the client's payroll for an initial duration of 6 months, with the possibility of extension based on performance and project needs. Responsibilities include ensuring the architectural integrity of massive data migrations and performing automated validation for a global enterprise. This role offers a strategic path into high-tier US product environments.
Key Requirements
7+ Years of overall IT experience in Quality Engineering.
Proven hands-on experience working as a QA Tester in ETL environments.
Strong proficiency in SQL.
Deep experience in RDBMS databases such as Oracle or SQL Server.
Hands-on expertise in Java-Selenium or Python automation testing.
Fluent in Agile workflows and cross-functional team collaboration.
Exceptional English communication skills for stakeholder interfacing.
Ability to bridge gaps between complex data pipelines and automation frameworks.
Experience in architectural integrity for massive data migrations.
Proactive mindset for high-tier US product environment standards.
0 Negotiable or Not Mentioned
India
50 days ago
cgi.com
1573 Views
CGI is looking for an experienced Automation Tester specializing in Selenium with C# or Java and API Automation Testing. The ideal candidate will have between 5 and 8 years of professional experience, with a strong emphasis on building robust testing frameworks and ensuring high software quality. This role requires working closely within Agile/Scrum teams and leveraging modern DevOps tools to streamline testing processes. Candidates who are immediately available to join are highly preferred.
0 Negotiable or Not Mentioned
India, Hyderabad
17 days ago
interaslabs.com
859 Views
Interaslabs is seeking a highly skilled Senior Applications Analyst - I specialized in Supply Chain Management (SCM) modules for a hybrid role based in Hyderabad. The successful candidate will join a dynamic team focused on maintaining and enhancing Oracle E-Business Suite R12 environments. This role requires a unique blend of technical prowess and functional understanding to effectively support multi-organization structures and business processes across various distribution modules such as Order Management, Purchasing, Inventory, and Shipping.
The responsibilities include leading technical developments across RICEW components and ensuring the smooth execution of Order to Cash and Procure to Pay cycles. You will collaborate closely with business stakeholders to gather requirements, develop test plans, and manage third-party integrations. This position offers a significant opportunity to work on complex data conversions and full lifecycle implementations within a professional and collaborative atmosphere, requiring strong communication skills and a deep understanding of SQL and PL-SQL development.
Key Requirements
A proven techno-functional professional with at least one full lifecycle Oracle E-Business Suite R12 implementation or support experience.
Around 5 to 8 years of Oracle applications technical experience in Distribution modules like OM, PO, INV, and Shipping.
Must have exposure to Oracle EBS Order to Cash and Procure to Pay processes.
Implementation experience specifically within distribution modules is highly preferable.
Strong technical experience on SQL and PL-SQL is expected.
Proficiency in XML Publisher and BI Publisher for reporting and data presentation.
Development knowledge and hands-on experience in one or more RICEW components including forms, reports, and interfaces.
Experience in managing conversions, enhancements, and workflows within the Oracle ecosystem.
Candidate with a techno-functional exposure to multi-organization structures is highly desirable.
Experience in data conversions and integrations with third-party applications will be an added advantage.
0 Negotiable or Not Mentioned
India, Hyderabad
30 days ago
visiontek.co.in
1871 Views
Visiontek is looking for a skilled Oracle EBS R12 Technical Consultant to join our team in Hyderabad. The successful candidate will be responsible for managing and developing technical solutions within the Oracle E-Business Suite environment, specifically focusing on Order to Cash (O2C) and Procure to Pay (P2P) functional flows. This role involves working extensively with RICE components (Reports, Interfaces, Conversions, and Extensions) to ensure that business processes are supported by robust technical architecture. You will be expected to handle complex SQL queries, PL/SQL development, and the creation of alerts and triggers to maintain system integrity.
Beyond technical development, the role requires a consultant who can participate in end-to-end implementations and provide expert support in report generation and customization. Candidates with exposure to manufacturing business flows will have a significant advantage in this position. You will collaborate with functional teams to translate business requirements into technical specifications, ensuring that the Oracle EBS environment is optimized for performance and scalability. This is an excellent opportunity for a professional with 2 to 4 years of experience to take the next step in their career within a challenging and growth-oriented environment.
Key Requirements
2 to 4 years of hands-on experience as an Oracle EBS R12 Technical Consultant.
Proven experience with functional flows in Order to Cash (O2C) and Procure to Pay (P2P).
Strong expertise in RICE components (Reports, Interfaces, Conversions, Extensions).
Proficiency in developing and customizing Oracle Forms and Reports.
Deep technical knowledge of SQL, PL/SQL, Triggers, Alerts, and Joins.
Participation in at least one full end-to-end Oracle EBS implementation project.
Demonstrated ability in report generation and technical customization.
Prior exposure to manufacturing business flows is highly preferred.
Strong analytical and problem-solving skills for troubleshooting technical issues.
Excellent communication skills to work effectively with functional consultants and stakeholders.
0 Negotiable or Not Mentioned
India
51 days ago
ust.com
1660 Views
UST is currently seeking a highly skilled DBA Operations Engineer to join their dynamic and growing team. This role involves managing and optimizing database environments, specifically focusing on SQL and PostgreSQL systems. The successful candidate will leverage modern automation tools such as Terraform, Ansible, and GitOps to streamline operations and ensure high availability across infrastructure. This role is a critical part of the operations team.
0 Negotiable or Not Mentioned
India
31 days ago
hmwcomm.com
1738 Views
The Weekend F2F Hackathon Hiring Drive on March 21st, 2026, is recruiting experienced Java Backend Developers for a growing tech team. This fast-track hiring event is designed for professionals with 6 to 10 years of experience who are ready to showcase their real-world coding skills. Potential work locations for this role include Bangalore and Chennai, where candidates will participate in a competitive coding environment with the possibility of receiving immediate job offers.
In this role, you will be responsible for developing robust microservices using Java 17 and Spring Boot. You will implement modern design patterns and ensure system scalability through containerization tools like Docker and Kubernetes. The position requires a strong focus on quality through TDD/BDD practices and familiarity with cloud platforms such as AWS, GCP, or Azure. You will also manage data across various RDBMS including Oracle, SQL Server, and PostgreSQL.
Key Requirements
6 to 10 years of professional experience in Java development
Strong proficiency in Java 17 and Spring Boot frameworks
Proven experience with Microservices architecture and Design Patterns
Hands-on experience with containerization using Docker and Kubernetes
Working knowledge of Cloud platforms, preferably AWS, GCP, or Azure
Experience with unit testing frameworks like JUnit and Mockito
Familiarity with TDD or BDD development methodologies
Deep understanding of RDBMS such as Oracle, SQL Server, or PostgreSQL
Exposure to NoSQL databases and Kafka messaging systems
Ability to work in a fast-paced, high-pressure hackathon environment
Strong problem-solving skills and clean coding practices
0 Negotiable or Not Mentioned
India, Remote
25 days ago
ashratech.com
1884 Views
Ashratech is seeking a highly skilled Mainframe Micro Focus COBOL Developer to join our specialized technical team. The ideal candidate will have over six years of professional experience and the ability to operate independently to create detailed designs and functional modules. This role requires a strong technical background in Micro Focus COBOL, SQL, and Windows Scripting, along with a deep understanding of job schedulers to ensure efficient processing and system performance.
This position is offered as a remote role, though preference will be given to candidates located near our regional hubs in Pune, Mumbai, Chennai, and Bangalore. As part of our team, you will be responsible for end-to-end development tasks, from initial design through implementation and maintenance. We value effective communication and a proactive approach to problem-solving, as you will be working on critical mainframe infrastructure that drives our business success.
Key Requirements
A minimum of 6 to 7 years of relevant experience in Mainframe development.
Expert-level proficiency in Micro Focus COBOL programming.
Strong hands-on experience with SQL for database operations.
Demonstrated expertise in Windows Scripting for automation.
Extensive knowledge and experience working with Job Scheduler tools.
Ability to work independently and take ownership of technical modules.
Proven track record in creating system designs and functional modules.
Effective communication skills for collaborating with technical and non-technical stakeholders.
Solid understanding of the Software Development Life Cycle (SDLC).
Strong analytical and troubleshooting skills within a mainframe environment.
0 Negotiable or Not Mentioned
India
10 days ago
ampstek.com
824 Views
Ampstek is currently seeking a highly skilled Automation Test Engineer with 5 to 10 years of experience to join our dynamic team. The ideal candidate will have extensive hands-on experience with Ericsson Mediation or Mediation Zone (MZ) and a strong background in test automation using Java or Python. This role is central to our quality assurance processes within the telecom domain, specifically focusing on CDR, billing, and mediation systems. Candidates will be expected to work with SQL and Unix/Linux environments to ensure the stability and efficiency of our technical solutions.
The positions are available in both Chennai and Pune, India, and we are looking for candidates who can join immediately or within a short notice period. In addition to technical proficiency in API testing and automation frameworks, experience with CI/CD tools like Jenkins or Azure DevOps and working within Agile/Scrum methodologies is highly desirable. This is an excellent opportunity for a professional looking to advance their career in a challenging and innovative telecommunications environment while working with cutting-edge mediation technologies.
Key Requirements
Hands-on experience with Ericsson Mediation / Mediation Zone (MZ)
Strong automation skills in Java or Python programming
Telecom domain experience (CDR, billing, mediation systems)
Good knowledge of SQL for database querying and testing
Proficiency in Unix/Linux operating systems and commands
Experience in API testing and working with automation frameworks
Familiarity with CI/CD tools such as Jenkins or Azure DevOps
Working experience in an Agile or Scrum environment
Exposure to ETL and Data testing processes
Ability to join immediately or within a short notice period
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities
0 Negotiable or Not Mentioned
India
24 days ago
cgi.com
1257 Views
CGI is currently seeking a skilled Automation Tester to join their dynamic team in India, with potential work locations including Bangalore, Chennai, and Hyderabad. This role is specifically tailored for individuals with 3 to 6 years of experience, particularly within the banking domain. Candidates will be expected to work flexibly to align with European shifts, providing high-quality testing services across various banking platforms. Immediate joiners are highly preferred for this position.
The successful applicant will demonstrate expertise in Python, Java, and automation frameworks such as Selenium or Robot Framework. A strong understanding of SQL is essential, along with the ability to use modern AI-assisted development tools like GitHub Copilot and Gemini to streamline testing processes. As an Automation Tester at CGI, you will play a crucial role in maintaining software integrity; a notice period of no more than 30 days is required. Interested candidates should provide their current and expected CTC along with their notice period when applying.
Key Requirements
Minimum of 3 to 6 years of experience in automation testing.
Strong proficiency in Python programming.
Strong proficiency in Java programming.
Experience with automation frameworks such as Selenium or Robot Framework.
Proven background working within the banking domain.
Proficiency in writing and executing SQL queries for database testing.
Familiarity with AI-assisted development tools like GitHub Copilot.
Familiarity with Gemini AI or similar productivity tools.
Willingness and flexibility to work during European shifts.
A notice period of no more than 30 days.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills in English.
0 Negotiable or Not Mentioned
India, Hyderabad
11 days ago
collectius.com
647 Views
We are hiring an MIS Manager to join our operations in Madhapur, Hyderabad. This role requires a data-driven professional who can manage complex information systems and provide critical insights to support our business objectives. Candidates should have extensive experience in reporting and data analysis, particularly within fintech, NBFCs, or collection agencies. In this role, you will be tasked with maintaining databases using SQL and performing advanced data analysis in Excel. Your primary responsibility will be to ensure the accuracy and timeliness of all management information reports. You will work in a collaborative environment where your technical expertise will directly contribute to the success of our collection and recovery strategies in India.
Key Requirements
7 to 9 years of experience in MIS or data management
Mandatory advanced proficiency in SQL for database queries
Mandatory expert-level skills in Microsoft Excel
Previous experience in Fintech or NBFC industries
Previous experience in Collection Agencies
Strong capability in creating automated reports and dashboards
Ability to clean, process, and validate large datasets efficiently
Experience with data visualization tools and methodologies
Excellent analytical and problem-solving abilities
Strong verbal and written communication skills for reporting findings
0 Negotiable or Not Mentioned
India, Hyderabad
15 days ago
edurungroup.in
861 Views
EduRun Group is actively seeking a highly skilled Senior System Engineer to join our client team in a hybrid capacity based in Hyderabad. The successful candidate will be responsible for leading architecture-level discussions and breaking down complex solutions into manageable technical requirements. You will be instrumental in creating detailed integration diagrams, sequence flows, and comprehensive system architecture models to ensure robust and scalable system performance across multiple dependencies.
Beyond technical design, this role requires a professional who can drive requirements, clarify use cases, and engage directly with customers to provide high-quality technical solutions. You will oversee system validation efforts, mentor junior engineers, and review design documents to maintain high engineering standards. We are looking for someone with a strong background in REST APIs, JSON/XML, and system thinking who is comfortable working in a tech stack that includes Java, Node.js, or Python, alongside AWS and SQL databases. The role also offers the opportunity to participate in RFP solutioning and use modern AI tools to enhance productivity.
Key Requirements
Minimum of 7 to 10 years of professional experience in system engineering or a similar technical role.
Strong expertise in designing and working with REST APIs, JSON, and XML data formats.
Proven ability to independently create integration diagrams, sequence flows, and system architecture models.
Experience in documenting business requirements (BRD), functional requirements (FRD), and user stories.
Advanced system thinking skills with the ability to analyze data flows and multi-system behaviors.
Strong communication skills and experience in customer-facing roles to clarify use cases and requirements.
Ability to lead integration testing strategies and oversee overall system validation efforts.
Working knowledge of cloud technologies, with a strong preference for Amazon Web Services (AWS).
Solid understanding of microservices architecture and messaging systems like Kafka or RabbitMQ.
Proficiency in SQL database fundamentals and experience with tools like Postman and Swagger.
0 Negotiable or Not Mentioned
India, Hybrid
26 days ago
genixcyber.com
1684 Views
Genix Cyber India is looking for a dedicated Saviynt Support Professional with at least 5 years of experience to join their team. This hybrid role involves providing L2 and L3 support for Saviynt Identity Governance and Administration (IGA) solutions within complex enterprise environments. The successful candidate will be responsible for monitoring JML lifecycle workflows, onboarding enterprise applications via various connectors, and managing access reviews and certifications to ensure robust security posture. The position requires a deep technical understanding of IGA concepts and strong skills in SQL and Web Services. You will collaborate closely with clients and stakeholders to troubleshoot issues and drive optimization and performance tuning. This is a great opportunity for candidates with a strong background in Identity Access Management (IAM) to work with modern tools like Workday and Azure AD while solving challenging production incidents in a dynamic security landscape.
Key Requirements
5+ years of hands-on experience in Saviynt production support and troubleshooting.
Deep understanding of IGA concepts, provisioning, and reconciliation.
Proven experience in monitoring and maintaining Joiner–Mover–Leaver (JML) lifecycle workflows.
Expertise in onboarding enterprise applications using REST, JDBC, AD, and SAP connectors.
Ability to manage complex Access Reviews, Certifications, and Campaigns.
Strong troubleshooting skills for workflows, roles, approval processes, and system rules.
Proficiency in SQL and Web Services including REST, JSON, SCIM, and SOAP.
Hands-on experience with integrations involving Workday, Active Directory, and Azure AD.
Experience handling high-priority production incidents and change requests.
Ability to drive performance tuning and incident resolution activities in a hybrid environment.