Negotiable or Not Mentioned
India, Remote
56 days ago
myskysys.com
2871 Views
The Senior Data Operations Engineer role is a 100% remote full-time contract position designed for candidates located in India. The successful candidate will collaborate closely with business partners to establish system connectivity and oversee the entire lifecycle of data pipelines, from initial collection to the final deployment of data models. This role requires a professional who can monitor pipeline performance, resolve bottlenecks, and implement process improvements through automation to enhance data integrity across various products. Key responsibilities include performing end-to-end unit testing and code reviews while advocating for a strong culture of process and data quality. The candidate must be comfortable presenting to large groups and working within an agile framework. With a contract duration of over 12 months and the possibility of extension, this role offers a stable opportunity for an expert in data warehousing and cloud optimization to contribute to high-visibility projects.
Key Requirements
Bachelor's degree in Computer Science or Engineering or equivalent experience.
At least five years of relevant experience in a data role working with data warehouses.
Proficiency in SQL and database management techniques.
Experience with modern Extract/Load/Transform (ELT) orchestration tools.
Hands-on experience with Azure Data Factory or Apache Airflow.
Familiarity with cloud services such as AWS, Azure, or Google Cloud.
Understanding of data warehousing solutions, specifically Snowflake.
Experience with Git and Git-based development workflows.
Knowledge of data modeling, data warehousing, and architecture principles.
Proven track record in optimization and cost savings within cloud environments.
Strong communication skills to convey complex data issues to stakeholders.
Ability to work in an agile development methodology environment.
Negotiable or Not Mentioned
India, Remote
18 days ago
idtsolution.in
1823 Views
We are looking for a skilled and passionate Data Engineer to join our advanced analytics team. The ideal candidate will have hands-on experience in IICS, Snowflake, SQL, and Cloud platforms, with a strong foundation in building scalable data pipelines and modern data integration frameworks. You will play a key role in developing data-driven solutions to support AI/ML initiatives and enhance customer experience across global operations. The role requires working from 12:00 PM to 9:00 PM IST to align with team requirements.
Your core responsibilities will include designing ETL/ELT processes and managing cloud-based data solutions on platforms like AWS, Azure, or GCP. You will handle both structured and semi-structured data while implementing essential data quality and monitoring processes. Collaboration with cross-functional teams is vital to support business growth initiatives. The package for this role is up to ₹14 LPA, reflecting the expertise required to handle complex data architectures and DevOps practices within a modern data stack environment.
Key Requirements
2+ years of experience in Data Engineering.
Strong hands-on experience with Snowflake.
Proficiency in Cloud platforms (AWS, Azure, or GCP).
Expert knowledge of SQL and Python.
Extensive experience with Informatica Cloud (IICS) and ETL tools.
Proven ability to build scalable data pipelines and cloud-based data solutions.
Knowledge of serverless architectures and APIs.
Familiarity with DevOps practices including CI/CD and IaC.
Experience with streaming tools like Kafka or Spark Streaming.
Bachelor’s degree in a relevant field such as Computer Science or Engineering.
Negotiable or Not Mentioned
India, Remote
14 days ago
sapphiresoftwaresolutions.com
1332 Views
We are seeking a skilled Data Engineer to join a fast-growing team supporting major global brands like KFC, Pizza Hut, and Taco Bell. This is a fully remote role based in India, with a shift schedule of 12 PM to 9 PM IST. The initial contract duration is three months, with a high probability of extension based on performance and project needs. You will be responsible for building and optimizing data pipelines using Informatica IICS and Snowflake, focusing on scalable data integration frameworks within an AWS cloud environment. The ideal candidate should have at least 2 years of experience in data engineering, with strong technical skills in Python scripting, SQL, and event-driven architectures. You will work on impactful global projects, supporting advanced analytics and AI/ML initiatives while collaborating with dedicated DevOps teams. Exposure to Airflow and streaming pipelines such as Kafka or AWS streaming services is highly desirable. This is an excellent opportunity to work on modern data platforms and drive data-driven decisions for world-class organizations.
Key Requirements
2+ years of professional Data Engineering experience.
Proficiency in Informatica Cloud (IICS) as the primary ETL tool.
Hands-on experience with Snowflake as a target data platform.
Strong expertise in AWS (Amazon Web Services) cloud environment.
Advanced knowledge of SQL for complex data queries and manipulation.
Solid programming skills in Python for scripting and automation.
Experience handling both structured and semi-structured data formats.
Familiarity with REST APIs and AWS Lambda functions.
Ability to work the 12 PM to 9 PM IST shift.
Capacity to collaborate effectively with cross-functional teams and DevOps.
Negotiable or Not Mentioned
India
31 days ago
gspann.com
1835 Views
GSPANN is currently seeking five dedicated AI Ops Engineers to join our team across several locations in India, including Gurugram, Hyderabad, Pune, and Delhi NCR. This role focuses on optimizing AI operations and ensuring the seamless integration of machine learning models into production environments through robust automation and monitoring strategies. As part of a forward-thinking team, you will play a crucial role in maintaining the reliability and efficiency of our AI infrastructure to support large-scale enterprise deployments. Candidates should have at least 3 years of relevant experience and demonstrate a strong background in SQL, Python, and major cloud platforms like AWS, GCP, or Azure. You will work extensively with DevOps methodologies and CI/CD pipelines to build impactful AI-driven solutions that solve complex business challenges. The ideal candidate is someone who is passionate about the intersection of artificial intelligence and systems engineering and is eager to build something impactful together.
Key Requirements
Minimum of 3 years of professional experience in technical roles.
Proficiency in AIOps methodologies and operational frameworks.
Strong database management skills using SQL.
Advanced programming abilities in Python.
Hands-on experience with Cloud Platforms including AWS, GCP, or Azure.
Solid understanding of DevOps practices and principles.
Expertise in designing and implementing CI/CD pipelines.
Experience with system monitoring and observability tools.
Familiarity with containerization technologies such as Docker or Kubernetes.
Strong analytical and troubleshooting skills for production systems.
Negotiable or Not Mentioned
India, Remote
6 days ago
tekvo.io
368 Views
Tekvo is looking for a seasoned and highly motivated Azure Technical Lead to spearhead our cloud data engineering projects. In this critical role, you will be the driving force behind large-scale Azure analytics initiatives, overseeing the end-to-end development of high-impact data platforms. You will be responsible for defining the solution design, ensuring technical excellence across the delivery lifecycle, and providing strategic guidance to engineering teams. Your expertise will directly contribute to the creation of scalable, robust, and efficient data architectures that empower our clients to make data-driven decisions.
As an Azure Technical Lead, you must demonstrate mastery over core Azure data services such as Data Factory, Synapse, and Databricks. The role demands a blend of deep technical proficiency in SQL and Python along with the leadership skills required to mentor developers and manage complex stakeholder expectations. This remote position offers a unique opportunity for professionals based in India to work on cutting-edge cloud technologies within a collaborative environment. Successful candidates will be expected to maintain high standards of code quality and architectural integrity while driving innovation in the data engineering space.
Key Requirements
Possess 10–14 years of professional experience in data engineering and cloud platforms.
Demonstrate expert-level proficiency in designing and implementing Azure Data Factory pipelines.
Have hands-on experience with Azure Synapse Analytics for enterprise data warehousing.
Show strong technical expertise in using Azure Databricks for big data processing.
Maintain advanced knowledge of SQL for complex data manipulation and performance tuning.
Exhibit proficiency in Python programming for automating data workflows and engineering tasks.
Prove a track record of leading and delivering large-scale analytics initiatives on Azure.
Possess strong solution design skills with the ability to create scalable data architectures.
Demonstrate the ability to guide, mentor, and manage high-performing technical teams.
Experience in cloud security best practices and data governance frameworks is highly preferred.
Excellent communication skills to interact with stakeholders and translate business needs into technical solutions.
Negotiable or Not Mentioned
India
8 days ago
gmail.com
487 Views
SearchMate is partnering with an elite client to recruit a Senior SDET specializing in ETL environments and backend automation. This is a high-impact role aimed at selection-ready professionals capable of managing complex data pipelines and developing robust automation frameworks. The candidate will work under a hybrid model with office locations available in Chennai, Hyderabad, and Pune. The selected candidate will join the client's payroll for an initial duration of 6 months, with the possibility of extension based on performance and project needs. Responsibilities include ensuring the architectural integrity of massive data migrations and performing automated validation for a global enterprise. This role offers a strategic path into high-tier US product environments.
Key Requirements
7+ years of overall IT experience in Quality Engineering.
Proven hands-on experience working as a QA Tester in ETL environments.
Strong proficiency in SQL.
Deep experience in RDBMS databases such as Oracle or SQL Server.
Hands-on expertise in Java-Selenium or Python automation testing.
Fluent in Agile workflows and cross-functional team collaboration.
Exceptional English communication skills for stakeholder interfacing.
Ability to bridge gaps between complex data pipelines and automation frameworks.
Experience ensuring architectural integrity during massive data migrations.
Proactive mindset aligned with the standards of high-tier US product environments.
Negotiable or Not Mentioned
India
10 days ago
ampstek.com
752 Views
Ampstek is currently seeking a highly skilled Automation Test Engineer with 5 to 10 years of experience to join our dynamic team. The ideal candidate will have extensive hands-on experience with Ericsson Mediation or Mediation Zone (MZ) and a strong background in test automation using Java or Python. This role is central to our quality assurance processes within the telecom domain, specifically focusing on CDR, billing, and mediation systems. Candidates will be expected to work with SQL and Unix/Linux environments to ensure the stability and efficiency of our technical solutions.
The positions are available in both Chennai and Pune, India, and we are looking for candidates who can join immediately or within a short notice period. In addition to technical proficiency in API testing and automation frameworks, experience with CI/CD tools like Jenkins or Azure DevOps and working within Agile/Scrum methodologies is highly desirable. This is an excellent opportunity for a professional looking to advance their career in a challenging and innovative telecommunications environment while working with cutting-edge mediation technologies.
Key Requirements
Hands-on experience with Ericsson Mediation / Mediation Zone (MZ)
Strong automation skills in Java or Python programming
Telecom domain experience (CDR, billing, mediation systems)
Good knowledge of SQL for database querying and testing
Proficiency in Unix/Linux operating systems and commands
Experience in API testing and working with automation frameworks
Familiarity with CI/CD tools such as Jenkins or Azure DevOps
Working experience in an Agile or Scrum environment
Exposure to ETL and data testing processes
Ability to join immediately or within a short notice period
Strong analytical and problem-solving skills
Excellent communication and teamwork abilities
Negotiable or Not Mentioned
India
56 days ago
ashudividend.com
551 Views
As a Databricks Developer at Ashu Dividend, you will be a key member of our data engineering team, responsible for designing, building, and maintaining robust and scalable data pipelines. This role focuses on leveraging Azure Databricks and PySpark to drive large-scale data projects, ensuring that our data infrastructure is efficient and reliable. You will work closely with cross-functional teams to translate business requirements into technical solutions.
Negotiable or Not Mentioned
India, Hyderabad
55 days ago
ust.com
545 Views
UST is seeking highly skilled professionals for the role of SQL and AWS Scripting Professional to join our dynamic team in Hyderabad. In this role, you will be responsible for managing and optimizing complex database environments, leveraging AWS services like Aurora and utilizing Terraform for infrastructure management. You will play a critical role in ensuring database performance and reliability while using scripting languages such as Python.
Negotiable or Not Mentioned
India
16 days ago
nishtechnologies.com
936 Views
Nish Technologies is seeking a skilled Data Engineer to join a prestigious Big4 MNC client on a full-time basis. The ideal candidate will have between 5 and 8 years of professional experience in data engineering and will be responsible for designing and implementing efficient data solutions. The primary work locations for this role are Hyderabad and Bangalore. This position requires a strong technical background and the ability to work in a fast-paced, high-impact environment.
Candidates must be proficient in Python, SQL, and PySpark to handle complex data sets and pipelines. We are conducting a virtual weekend recruitment drive on Saturday, April 4th. This role is intended for immediate joiners or those with a notice period of up to 15 days. Interested professionals are encouraged to share their profiles for consideration in this expedited hiring process.
Key Requirements
5 to 8 years of relevant experience in Data Engineering.
Advanced proficiency in Python programming for data processing.
Expertise in writing complex SQL queries for database management.
In-depth knowledge of PySpark and its application in big data projects.
Ability to join immediately or within a maximum notice period of 15 days.
Experience working within a Big4 MNC or similar large-scale environment.
Strong analytical skills to solve complex data-related problems.
Familiarity with ETL processes and data pipeline orchestration.
Excellent communication skills for cross-functional team collaboration.
Availability to participate in the virtual weekend recruitment drive on April 4th.
Negotiable or Not Mentioned
India, Remote
7 days ago
allruva.com
818 Views
Allruva is looking for a seasoned Salesforce Developer to join our team for enterprise-level Salesforce implementations. This role is remote-based for candidates in India, following an EST shift schedule (4:00 PM – 1:00 AM IST). The ideal candidate will be responsible for designing and developing scalable solutions using the full Salesforce stack, including Apex, Lightning Web Components (LWC), and Aura components. You will play a critical role in building integrations with external systems via REST/SOAP APIs and automating complex business processes to ensure platform efficiency.
Beyond core development, you will collaborate closely with product, architecture, and QA teams in a dynamic Agile/Scrum environment. You will be expected to maintain high coding standards through unit testing and peer code reviews. We are looking for a professional with over 7 years of experience who can thrive in a fast-paced setting, deliver high-quality technical documentation, and support deployment cycles using CI/CD tools. This position offers an opportunity to work on large-scale enterprise projects within a supportive technical framework.
Key Requirements
Minimum of 7 years of professional experience in Salesforce development.
Advanced proficiency in Lightning Web Components (LWC) and Aura components.
Expertise in Salesforce backend development including Apex, Triggers, and Visualforce.
Hands-on experience building and maintaining Salesforce integrations using REST/SOAP APIs and OAuth.
Strong understanding of Salesforce security models, including roles, profiles, and permissions.
Proven experience with Salesforce automation tools like Flows, Process Builder, and Workflows.
Familiarity with CI/CD tools and DevOps practices specifically for the Salesforce platform.
Ability to work during EST shifts (4:00 PM – 1:00 AM IST).
Salesforce Platform Developer II Certification is highly preferred.
Strong technical documentation skills and experience with unit testing and code reviews.
Negotiable or Not Mentioned
India, Hyderabad
27 days ago
mysbscorp.com
1833 Views
We are seeking a highly skilled DevOps Engineer to join our dynamic team, focusing on Ansible and Terraform to automate AWS EC2 builds. The ideal candidate will be responsible for creating robust, scalable infrastructure-as-code solutions and streamlining our cloud deployment workflows to ensure efficiency and reliability. This role is crucial for our ongoing digital transformation and infrastructure optimization projects.
The position is available on a full-time or contract basis, with a preference for candidates who can work onsite in Hyderabad, India, although remote arrangements are also being considered. You will collaborate closely with development and operations teams to maintain a high-performance AWS environment. If you have a passion for automation and cloud infrastructure, we encourage you to apply with your updated resume.
Key Requirements
Proficiency in Ansible for configuration management and application deployment.
Extensive experience with Terraform for provisioning infrastructure as code.
Proven ability to automate AWS EC2 instance builds and management.
Strong understanding of AWS core services and cloud architecture.
Experience with Linux/Unix system administration and shell scripting.
Familiarity with CI/CD tools and pipeline integration processes.
Ability to work effectively in both onsite and remote environments.
Strong problem-solving skills and attention to detail in infrastructure tasks.
Excellent communication skills for collaborating with cross-functional teams.
Experience with version control systems, particularly Git.
Negotiable or Not Mentioned
India
20 days ago
pyxidiatech.com
1237 Views
We are seeking a highly skilled Senior Data Analyst to join our dynamic team and play a pivotal role in designing, building, and scaling innovative data solutions across our products and client implementations. The successful candidate will be responsible for developing scalable data pipelines, optimizing ETL workflows, and ensuring the highest standards of data quality and reliability. You will work closely with cross-functional teams, including Product, Engineering, and Data Science, to drive data architecture decisions and deliver actionable insights that solve complex business challenges. This role offers the opportunity to provide technical guidance and mentorship to junior analysts while working on impactful, data-driven projects.
This position is available in multiple locations across India, specifically Mumbai, Bangalore, and Pune. The ideal candidate will have over four years of experience in data-focused roles and a deep understanding of the AWS ecosystem, including Redshift, Athena, and EMR. Experience with US Healthcare Data is considered a significant advantage. Candidates must be proficient in Python, PySpark, and SQL, and possess a strong grasp of data modeling and performance optimization. If you are passionate about big data and looking to make a significant impact in a collaborative environment, we encourage you to apply.
Key Requirements
Minimum of 4 years of professional experience in Data Analytics or Data Engineering.
Demonstrated expertise in SQL and both relational and NoSQL database management.
Hands-on proficiency with Python and PySpark for processing large-scale datasets.
Proven experience in building and optimizing ETL/ELT pipelines using Airflow or AWS Glue.
Strong conceptual understanding of data modeling and performance tuning.
Advanced technical knowledge of AWS data services including S3, EMR, Redshift, and Athena.
Ability to design and manage complex data architectures across structured and unstructured sources.
Competency in maintaining high standards for data quality, validation, and monitoring.
Strategic thinking skills to perform analysis and generate actionable business insights.
Strong collaborative skills to work effectively with Product, Engineering, and Client teams.
Negotiable or Not Mentioned
India
14 days ago
sysmind.com
1178 Views
We are looking for an experienced Senior Power BI Developer to lead the design and maintenance of critical dashboards and reports for our business operations. This position is available in both Hyderabad and Pune as a contract-to-hire opportunity. The role involves working closely with stakeholders to understand their data needs and delivering high-quality analytical solutions that drive decision-making through intuitive data visualization and robust reporting structures.
The candidate must demonstrate strong expertise in data modeling, ETL processes, and writing optimized SQL, DAX, and Power Query (M) code. Practical experience with cloud data integration tools such as Informatica IICS, Snowflake, and Azure Blob storage is highly preferred. You will be responsible for ensuring the integrity and performance of the data pipeline, from source systems to final visual reports, while collaborating with cross-functional teams in a fast-paced environment.
Key Requirements
6–8 years of professional experience as a Power BI Developer or in a similar Business Intelligence role.
Advanced proficiency in designing and maintaining complex Power BI dashboards and reports.
Strong expertise in data modeling and creating efficient relationships between diverse data sets.
Proven ability to write and optimize complex DAX queries for advanced calculations.
Deep understanding of Power Query (M) for data transformation and cleaning processes.
Excellent SQL skills for data extraction, manipulation, and performance tuning.
Hands-on experience with ETL tools, specifically Informatica IICS, for data integration.
Familiarity with cloud data warehousing solutions such as Snowflake.
Knowledge of Azure cloud services, including Azure Blob storage for data management.
Strong analytical and problem-solving skills with the ability to work in a contract-to-hire capacity.
Negotiable or Not Mentioned
India, Pan India
27 days ago
ampcustech.com
1453 Views
Ampcustech is currently seeking a highly skilled GenAI Solution Architect to join our innovative team in India on a hybrid basis. The ideal candidate will be responsible for designing and implementing scalable, enterprise-grade AI-driven solutions utilizing Large Language Models (LLMs) and various cloud platforms. You will play a pivotal role in bridging the gap between cutting-edge AI research and production-ready applications, working closely with both engineering and product management teams to deliver high-impact results.
The role involves building diverse use cases, including advanced document processing, automated summarization, intelligent Q&A systems, AI copilots, and workflow automation. You will define technical reference architectures for LLMs, vector stores, and orchestration layers, ensuring that all GenAI services are seamlessly integrated into existing data platforms. Furthermore, you will lead efforts in prompt engineering and optimization while maintaining a strong focus on security, privacy, and responsible AI ethics. This is an exciting opportunity to drive innovation in the healthcare domain and beyond.
Key Requirements
4–7 years of experience in software, data, or cloud architecture roles.
Hands-on experience with Generative AI and LLM-based solutions.
Proficiency with Azure OpenAI, OpenAI APIs, and LangChain.
Experience with Semantic Kernel or similar orchestration frameworks.
Strong understanding of REST APIs and cloud-based architectures.
Ability to collaborate effectively with engineering and product teams.
Strong communication skills to explain AI concepts to non-technical stakeholders.
Proficiency in prompt engineering, evaluation, and optimization techniques.
Experience defining reference architectures for vector stores and orchestration layers.
Commitment to ensuring security, privacy, and responsible AI practices.
Negotiable or Not Mentioned
India
20 days ago
se-mentor.com
1145 Views
We are looking for an experienced Databricks Engineer to enhance our data engineering capabilities at SE Mentor Solutions. In this role, you will be leveraging Azure services, PySpark, and Databricks to build high-performance data frameworks. Your contributions will help drive our data-driven decision-making processes by ensuring the reliability and scalability of our cloud-based data systems. This position offers the flexibility to work from either Cochin or Bengaluru.
Candidates should possess at least 5 years of experience in the data engineering field with a focus on Azure technologies. You will be responsible for designing and implementing efficient data processing logic and managing complex data architectures within the Azure cloud environment. As part of our team, you will participate in the full lifecycle of data projects, from requirements gathering to deployment and monitoring. Salary information was not provided in the original job post.
Key Requirements
Over 5 years of experience in data engineering or related technical roles.
Strong expertise in SQL and PySpark for data processing tasks.
Hands-on experience with Azure Databricks and Azure Data Factory.
Familiarity with Azure Cloud infrastructure and related services.
Experience in building and optimizing large-scale data architectures.
Ability to design and implement automated data pipelines.
Understanding of data warehousing concepts and technologies.
Strong troubleshooting skills for cloud-based data environments.
Effective teamwork and communication skills.
Commitment to maintaining high standards of data security and integrity.
Negotiable or Not Mentioned
India
50 days ago
cgi.com
524 Views
CGI is looking for an experienced Automation Tester specializing in Selenium with C# or Java and API Automation Testing. The ideal candidate will have between 5 and 8 years of professional experience, with a strong emphasis on building robust testing frameworks and ensuring high software quality. This role requires working closely within Agile/Scrum teams and leveraging modern DevOps tools to streamline testing processes. Immediate joiners are preferred.
Negotiable or Not Mentioned
India
51 days ago
ust.com
525 Views
UST is currently seeking a highly skilled DBA Operations Engineer to join their dynamic and growing team. This role involves managing and optimizing database environments, specifically focusing on SQL and PostgreSQL systems. The successful candidate will leverage modern automation tools such as Terraform, Ansible, and GitOps to streamline operations and ensure high availability across infrastructure. The position is a critical part of the operations team.
~166,666 (Mentioned)
India
56 days ago
hyptechie.co.in
550 Views
Hyptechie is seeking an experienced Kubernetes Platform Technical Lead to lead and drive cloud-native infrastructure initiatives. The successful candidate will be instrumental in architecting and managing scalable platforms, ensuring that our infrastructure supports high-performance microservices and cloud-native applications. This role requires a visionary leader who can bridge the gap between development and operations to foster a robust DevOps culture.
Negotiable or Not Mentioned
India
53 days ago
ust.com
536 Views
UST is seeking experienced Infrastructure Engineers to join our growing global team. This role is focused on managing and optimizing enterprise-level infrastructure with a focus on Wintel systems, Windows Server environments, and VMware virtualization. The successful candidate will work closely with cross-functional teams to ensure high availability, security, and performance of our IT systems. This position offers an opportunity to work on cutting-edge technologies.
Negotiable or Not Mentioned
India
58 days ago
macronix.in
560 Views
Macronix is seeking a highly skilled and experienced Senior Java Developer to join our dynamic and growing technical team. This role is designed for a professional with over seven years of hands-on experience in Java technologies, capable of delivering high-quality software solutions. The position offers a hybrid work model, providing flexibility for candidates located near our primary hubs. We are looking for individuals who are passionate about modern Java development.
Negotiable or Not Mentioned
India
51 days ago
wavicledata.com
526 Views
Wavicle Data Solutions is currently looking for a talented and experienced Azure DevOps Engineer to join our growing team. This role offers the opportunity to work in a collaborative environment where you will leverage your technical skills to optimize cloud infrastructure and deployment processes. We are hiring for multiple locations across India, specifically in Chennai, Coimbatore, and Bangalore. The ideal candidate will have 6 to 9 years of relevant experience.
Negotiable or Not Mentioned
India
24 days ago
cgi.com
1309 Views
CGI is currently seeking a skilled Automation Tester to join their dynamic team in India, with potential work locations including Bangalore, Chennai, and Hyderabad. This role is specifically tailored for individuals with 3 to 6 years of experience, particularly within the banking domain. Candidates will be expected to work flexibly to align with European shifts, providing high-quality testing services across various banking platforms. Immediate joiners are highly preferred for this position.
The successful applicant will demonstrate expertise in Python, Java, and automation frameworks such as Selenium or Robot Framework. A strong understanding of SQL is essential, along with the ability to utilize modern AI-assisted development tools like GitHub Copilot and Gemini to streamline testing processes. As an Automation Tester at CGI, you will play a crucial role in maintaining software integrity while adhering to a 30-day notice period. Interested candidates should provide their current and expected CTC along with their notice period when applying.
Key Requirements
3 to 6 years of experience in automation testing.
Strong proficiency in Python programming.
Strong proficiency in Java programming.
Experience with automation frameworks such as Selenium or Robot Framework.
Proven background working within the banking domain.
Proficiency in writing and executing SQL queries for database testing.
Familiarity with AI-assisted development tools like GitHub Copilot.
Familiarity with Gemini AI or similar productivity tools.
Willingness and flexibility to work during European shifts.
A notice period of no more than 30 days.
Strong analytical and problem-solving skills.
Excellent verbal and written communication skills in English.
0 Negotiable or Not Mentioned
India
18 days ago
compugra.com
1105 Views
Compugra is urgently seeking a seasoned Java Full Stack Developer for a full-time engagement. This role is designed for a highly skilled professional with 7 to 10 years of experience who can hit the ground running in a fast-paced environment. The position offers the opportunity to work in dynamic technical hubs with possible work locations in Hyderabad and Bangalore.
The successful candidate will be responsible for end-to-end software development, involving both complex backend logic and responsive frontend design. You will work within a collaborative team to build scalable applications, maintain code quality, and implement modern software engineering practices. No specific salary was mentioned in the original posting.
Key Requirements
7 to 10 years of professional experience in full stack development.
In-depth knowledge of Java and the Spring Framework ecosystem.
Proficiency in frontend technologies such as React, Angular, or Vue.js.
Experience building and consuming RESTful web services.
Strong understanding of relational databases like MySQL, Oracle, or PostgreSQL.
Familiarity with Microservices architecture and containerization.
Proven experience with version control systems, specifically Git.
Ability to work effectively in an Agile/Scrum development environment.
Solid understanding of object-oriented programming principles and design patterns.
Excellent analytical, debugging, and problem-solving capabilities.
0 Negotiable or Not Mentioned
India
31 days ago
gspann.com
1298 Views
GSPANN is hiring an AI Developer with over 5 years of experience to contribute to our sophisticated artificial intelligence projects. The role is based in India, with offices located in Gurugram, Hyderabad, and Pune. You will be at the forefront of developing AI/ML solutions and managing complex data processes through efficient ETL and ELT pipelines. Candidates must be proficient in Azure AI Services, Python, and modern data integration processes. You will collaborate with cross-functional teams to build and deploy intelligent software systems, ensuring high performance and scalability of AI-driven applications. This role is ideal for a developer who enjoys solving difficult problems and wants to work on the latest technologies in the cloud AI space.
Key Requirements
Minimum of 5 years of experience in AI or software development.
In-depth knowledge of Azure AI Services and cloud environments.
Strong proficiency in Python development.
Extensive experience with ETL and ELT data processes.
Solid understanding of AI/ML algorithms and their application.
Experience in designing and maintaining scalable data pipelines.
Ability to work with both structured and unstructured data sources.
Familiarity with software development lifecycles (SDLC).
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork capabilities.
0 Negotiable or Not Mentioned
India
31 days ago
gspann.com
1641 Views
We are looking for an experienced GenAI Engineer to join the GSPANN team to work on cutting-edge generative technologies. This position is available in several key Indian locations including Gurugram, Hyderabad, and Pune. As a senior member of the team with over 6 years of experience, you will lead the development of innovative generative AI applications and frameworks that drive business value. The role requires deep expertise in Python, SQL, and MLOps to manage the entire lifecycle of generative models. You will be responsible for designing and implementing AI solutions, working closely with data scientists and software engineers to create impactful products. This is an opportunity to be at the forefront of the AI revolution and contribute to high-visibility projects within a global organization.
Key Requirements
At least 6 years of relevant experience in software or data engineering.
Expert-level proficiency in Python programming.
Strong command of SQL for data manipulation and analysis.
Extensive knowledge of Generative AI concepts and frameworks.
Proven experience with MLOps for model deployment and monitoring.
Solid background in Data Science methodologies.
General expertise in Artificial Intelligence and Machine Learning.
Experience with Large Language Models (LLMs) and fine-tuning.
Understanding of neural network architectures and transformers.
Ability to collaborate with cross-functional teams on complex projects.
0 Negotiable or Not Mentioned
India, Hyderabad
11 days ago
riscrm.com
773 Views
We are seeking a proactive and quick-thinking DevOps Engineer to join our team in Hyderabad. The ideal candidate will have over five years of hands-on experience and be ready to join immediately, ideally within 10 to 15 days. You will be responsible for managing and optimizing cloud infrastructures, specifically focusing on Google Cloud Platform services and Salesforce integrations. This role is perfect for someone who thrives in fast-paced environments and enjoys the challenge of complex system integrations.
In this position, you will utilize a wide array of tools including Kubernetes, Docker, and various CI/CD technologies like Jenkins and GitHub Actions. You will also be expected to work with middleware such as Kafka and Spring Boot while ensuring high code quality through SonarQube. Your expertise in scripting with Python or Bash and your understanding of RESTful APIs will be critical in automating workflows and maintaining a robust technical environment. This role offers an opportunity to work at the forefront of IT infrastructure in India.
Key Requirements
Minimum of 5 years of professional experience in DevOps engineering.
Hands-on expertise with Google Cloud Platform (GCP) including Cloud Run, Pub/Sub, Vertex AI, and IAM.
Proven experience with Salesforce environments using SFDX CLI, Apex, and LWC.
Strong proficiency in containerization and orchestration using Docker and Kubernetes.
Extensive knowledge of middleware technologies such as Kafka, Spring Boot, and MuleSoft.
Deep experience in setting up and maintaining CI/CD pipelines with GitHub Actions, Jenkins, and Maven.
Familiarity with code quality and artifact management tools like SonarQube and JFrog Artifactory.
Advanced scripting skills in Python, Bash, and Groovy for automation tasks.
Solid understanding of API architectures including REST, JSON, and GraphQL.
Availability to join immediately, within 10 to 15 days.
Strong analytical and problem-solving skills to handle complex technical integrations.
0 Negotiable or Not Mentioned
India
9 days ago
ust.com
566 Views
UST is currently seeking experienced Testers to join our dynamic team across various locations in India, including Pune, Chennai, Bangalore, Trivandrum, Kochi, Hyderabad, and Noida. This role is ideal for professionals with 4 to 13 years of experience who possess a strong background in quality assurance and functional testing. Candidates should be ready to work in a fast-paced environment and have a notice period of 30 days or less.
The successful candidates will be responsible for a wide range of testing activities, including manual and automated testing. Technical proficiency in Azure, SQL, and API testing using Postman is essential. Furthermore, experience with Selenium for automation and JMeter for performance testing is required. We value certifications such as ISTQB or CSM/SMCP and look for individuals who are well-versed in Agile practices to contribute effectively to our development cycles.
Key Requirements
4 to 13 years of professional experience in software testing and quality assurance.
Proven proficiency in functional testing and manual testing methodologies.
Technical experience with cloud platforms, specifically Microsoft Azure.
Strong knowledge of SQL and database management for backend testing.
Hands-on expertise in API testing using industry-standard tools like Postman.
Experience in designing and executing test automation using Selenium.
Ability to perform performance testing and analysis using JMeter.
Deep understanding of Agile methodologies and working in Scrum teams.
Professional certifications such as ISTQB, CSM, or SMCP are highly preferred.
Ability to join the organization within a notice period of 0 to 30 days.
0 Negotiable or Not Mentioned
India, Hyderabad
17 days ago
3shooltech.com
933 Views
We are seeking a highly skilled and experienced Essbase Developer to join our dynamic team and contribute to building scalable, high-performing financial reporting solutions. In this role, you will be responsible for the full lifecycle of development using Agile practices, including the design, development, testing, and debugging of Essbase-based financial reporting and analytical applications. You will work closely with business stakeholders and technical teams to ensure all deliverables meet quality, security, and compliance standards while solving complex technical challenges related to multidimensional data.
The successful candidate will manage Essbase OLAP applications (both ASO and BSO), creating outlines, load rules, and sophisticated MaxL and calculation scripts. Your responsibilities will extend to performance tuning, optimization, and supporting Smart View integrations with Excel. You will also be involved in designing star schemas, relational database modeling using Oracle or SQL Server, and automating processes through UNIX/Linux shell scripting and job automation workflows like Autosys. This position requires providing on-call support for production systems to ensure continuous operation and data integrity.
Key Requirements
At least 7 years of professional experience in application development.
Minimum of 3 years specifically in Essbase development and support.
5 to 7 years of hands-on experience with Essbase OLAP (ASO/BSO).
Proven expertise in working with financial reporting systems.
Strong proficiency in creating MaxL scripts and calculation scripts.
Experience in relational database modeling with Oracle or SQL Server.
Proficiency in UNIX/Linux shell scripting for process automation.
Familiarity with job automation tools such as Autosys.
Excellent problem-solving skills and analytical thinking.
Strong communication and collaboration skills for team environments.
0 Negotiable or Not Mentioned
India, Hyderabad
52 days ago
interaslabs.com
531 Views
Interas Labs is seeking a skilled and motivated Java Developer to join our fast-moving engineering team in Hyderabad. This is a full-time, hybrid role ideal for immediate joiners who can start within 0 to 7 days. As a member of our team, you will focus on building robust applications using Java (versions 8, 11, or 17), Spring Boot, and Microservices architecture. You will work extensively with RESTful APIs, distributed systems, and asynchronous m