Salary: Negotiable or Not Mentioned
India, Aurangabad
13 days ago
Svatantra.adityabirla.com
934 Views
Svatantra Microfin Pvt Ltd is seeking a dedicated Audit Associate to join its professional team in Aurangabad. As one of India’s leading microfinance companies, we provide a robust platform for individuals looking to build a strong career in Audit. You will be part of a dynamic and growth-oriented team, working to ensure compliance and operational excellence within the organization. This role is ideal for young professionals eager to gain hands-on experience in the microfinance sector and contribute to financial inclusion efforts across the region.
In this role, you will be responsible for conducting internal audits, verifying financial transactions, and ensuring adherence to company policies and regulatory frameworks. The position involves assessing risk management processes and suggesting improvements to internal controls. Successful candidates will enjoy a professional work environment that fosters learning and career progression. Your contributions will help maintain the integrity of our financial operations as we continue to expand our reach and impact in the microfinance industry.
Key Requirements
Graduation in any stream from a recognized university.
Age must be between 20 and 28 years.
Basic knowledge of audit principles and microfinance operations.
Strong analytical and logical reasoning skills.
Excellent written and verbal communication skills in English and local languages.
Proficiency in Microsoft Office, particularly Excel and Word.
High level of integrity and professional ethics.
Ability to travel to various locations as required for audit assignments.
Detail-oriented with a focus on accuracy in financial reporting.
Ability to work effectively in a team-oriented and fast-paced environment.
Salary: Negotiable or Not Mentioned
India, Hyderabad
18 days ago
screativesoft.com
1346 Views
Screatives Software Services Pvt. Ltd. is currently seeking motivated freshers for the position of Process Associate at our Hyderabad office located in Mindspace. This is a full-time role specifically for the night shift, operating from 7 PM to 4 AM. Candidates will be responsible for various process-oriented tasks, ensuring high quality and accuracy in their daily output within a fast-paced environment.
As a fresher, you will receive training to handle specialized business processes and software tools. The ideal candidate should be prepared for a collaborative workspace and demonstrate a strong commitment to professional growth. Located in the tech hub of Building No. 9, Mindspace, this role offers an excellent starting point for those looking to enter the IT services industry.
Key Requirements
Basic computer literacy.
Excellent written and verbal English communication.
Ability to work night shifts (7 PM - 4 AM).
Strong attention to detail.
Analytical thinking skills.
High school diploma or equivalent.
Willingness to learn new software.
Time management skills.
Ability to work in a team.
Data entry speed and accuracy.
Salary: Negotiable or Not Mentioned
India, Hyderabad
18 days ago
3shooltech.com
936 Views
We are seeking a highly skilled and experienced Essbase Developer to join our dynamic team and contribute to building scalable, high-performing financial reporting solutions. In this role, you will be responsible for the full lifecycle of development using Agile practices, including the design, development, testing, and debugging of Essbase-based financial reporting and analytical applications. You will work closely with business stakeholders and technical teams to ensure all deliverables meet quality, security, and compliance standards while solving complex technical challenges related to multidimensional data.
The successful candidate will manage Essbase OLAP applications (both ASO and BSO), creating outlines, load rules, and sophisticated MaxL and calculation scripts. Your responsibilities will extend to performance tuning, optimization, and supporting Smart View integrations with Excel. You will also be involved in designing star schemas, relational database modeling using Oracle or SQL Server, and automating processes through UNIX/Linux shell scripting and job automation workflows like Autosys. This position requires providing on-call support for production systems to ensure continuous operation and data integrity.
Key Requirements
At least 7 years of professional experience in application development.
Minimum of 3 years specifically in Essbase development and support.
5 to 7 years of hands-on experience with Essbase OLAP (ASO/BSO).
Proven expertise in working with financial reporting systems.
Strong proficiency in creating MaxL scripts and calculation scripts.
Experience in relational database modeling with Oracle or SQL Server.
Proficiency in UNIX/Linux shell scripting for process automation.
Familiarity with job automation tools such as Autosys.
Excellent problem-solving skills and analytical thinking.
Strong communication and collaboration skills for team environments.
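The automation side of the Essbase role above (shell scripting, Autosys-style job workflows) often reduces to wrapping MaxL script execution in a schedulable job. A minimal sketch in Python, assuming the standard `essmsh` MaxL shell is on the PATH; the script name, user, and server here are hypothetical placeholders, not values from the posting:

```python
import subprocess
from pathlib import Path

def build_maxl_command(script: Path, user: str, server: str) -> list[str]:
    # Assemble an essmsh invocation; real deployments would also handle
    # credentials securely (e.g. a private key or encrypted login) rather
    # than passing a bare username.
    return ["essmsh", "-l", user, "-s", server, str(script)]

def run_nightly_load(script: Path, user: str, server: str) -> bool:
    # Run the MaxL script and report success, the way a scheduler
    # (Autosys, cron) job body might; the job's exit code drives retries.
    cmd = build_maxl_command(script, user, server)
    result = subprocess.run(cmd, capture_output=True, text=True)
    return result.returncode == 0

if __name__ == "__main__":
    print(build_maxl_command(Path("nightly_load.msh"), "admin", "essbase01"))
```

Keeping command construction separate from execution makes the wrapper testable without a live Essbase server.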
Salary: Negotiable or Not Mentioned
India
9 hours ago
emperentech.com
80 Views
Emperen Technologies is seeking elite Databricks talent to join our global team of experts. As an Official Databricks Partner, we specialize in helping enterprises scale their data transformation initiatives faster, smarter, and more cost-efficiently. We are looking for professionals who can hit the ground running on a contract or hourly basis to meet urgent delivery needs for our diverse portfolio of enterprise clients.
Candidates will be responsible for leveraging Azure Databricks, Spark, and PySpark to build robust data pipelines and architectures. The role involves deep involvement in data migration, modernization, and the integration of AI/ML models into existing business analytics frameworks. You will work closely with Data Engineers and Architects to enable outcomes that drive business value. If you possess deep technical capability and a proven track record in data initiatives, we encourage you to apply.
Key Requirements
Proficiency in Azure Databricks and Apache Spark ecosystems.
Strong experience with PySpark for large-scale data processing.
Solid background in Data Engineering and Data Architecture principles.
Expertise in Data Migration and Modernization of legacy systems.
Ability to integrate AI/ML and Analytics into production data pipelines.
Available to work on a Contract and Hourly Basis for urgent delivery.
Strong communication skills for collaborating with CTOs and Heads of Data.
Experience with cloud infrastructure and security best practices.
Proven ability to deliver high-quality talent outcomes in fast-paced environments.
Knowledge of Spark optimization and performance tuning techniques.
Salary: Negotiable or Not Mentioned
India, Remote
18 days ago
idtsolution.in
1775 Views
We are looking for a skilled and passionate Data Engineer to join our advanced analytics team. The ideal candidate will have hands-on experience in IICS, Snowflake, SQL, and Cloud platforms, with a strong foundation in building scalable data pipelines and modern data integration frameworks. You will play a key role in developing data-driven solutions to support AI/ML initiatives and enhance customer experience across global operations. The role requires working from 12:00 PM to 9:00 PM IST to align with team requirements.
Your core responsibilities will include designing ETL/ELT processes and managing cloud-based data solutions on platforms like AWS, Azure, or GCP. You will handle both structured and semi-structured data while implementing essential data quality and monitoring processes. Collaboration with cross-functional teams is vital to support business growth initiatives. The package for this role is up to ₹14 LPA, reflecting the expertise required to handle complex data architectures and DevOps practices within a modern data stack environment.
Key Requirements
2+ years of experience in Data Engineering.
Strong hands-on experience with Snowflake.
Proficiency in Cloud platforms (AWS, Azure, or GCP).
Expert knowledge of SQL and Python.
Extensive experience with Informatica Cloud (IICS) and ETL tools.
Proven ability to build scalable data pipelines and cloud-based data solutions.
Knowledge of serverless architectures and APIs.
Familiarity with DevOps practices including CI/CD and IaC.
Experience with streaming tools like Kafka or Spark Streaming.
Bachelor’s degree in a relevant field such as Computer Science or Engineering.
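The ETL/ELT and data-quality responsibilities described above follow a common shape: extract, transform, gate on quality, load. A plain-Python sketch of that shape, with invented field names and a simple completeness rule standing in for what IICS or Snowflake tasks would do at scale:

```python
# Minimal ETL sketch: extract records, apply a transform, run a
# data-quality gate, then "load" (here, just collect) the clean rows.

def extract():
    # Stand-in for reading from a source system into staging.
    return [
        {"order_id": 1, "amount": "120.50", "country": "IN"},
        {"order_id": 2, "amount": "", "country": "IN"},   # bad row: missing amount
        {"order_id": 3, "amount": "75.00", "country": "US"},
    ]

def transform(row):
    # Cast types; a real pipeline would also normalise codes, dates, etc.
    return {**row, "amount": float(row["amount"]) if row["amount"] else None}

def quality_gate(row):
    # Completeness check: reject rows with a null amount.
    return row["amount"] is not None

def run_pipeline():
    clean, rejected = [], []
    for raw in extract():
        row = transform(raw)
        (clean if quality_gate(row) else rejected).append(row)
    return clean, rejected

clean, rejected = run_pipeline()
print(f"loaded {len(clean)} rows, rejected {len(rejected)}")
```

Routing rejects to their own collection (rather than dropping them) is what makes the monitoring processes the listing mentions possible.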
Salary: Negotiable or Not Mentioned
India
16 days ago
nishtechnologies.com
939 Views
Nish Technologies is seeking a skilled Data Engineer to join a prestigious Big4 MNC client on a full-time basis. The ideal candidate will have between 5 and 8 years of professional experience in data engineering and will be responsible for designing and implementing efficient data solutions. The primary work locations for this role are Hyderabad and Bangalore. This position requires a strong technical background and the ability to work in a fast-paced, high-impact environment.
Candidates must be proficient in Python, SQL, and PySpark to handle complex data sets and pipelines. We are conducting a virtual weekend recruitment drive on Saturday, April 4th. This role is intended for immediate joiners or those with a notice period of up to 15 days. Interested professionals are encouraged to share their profiles for consideration in this expedited hiring process.
Key Requirements
5 to 8 years of relevant experience in Data Engineering.
Advanced proficiency in Python programming for data processing.
Expertise in writing complex SQL queries for database management.
In-depth knowledge of PySpark and its application in big data projects.
Ability to join immediately or within a maximum notice period of 15 days.
Experience working within a Big4 MNC or similar large-scale environment.
Strong analytical skills to solve complex data-related problems.
Familiarity with ETL processes and data pipeline orchestration.
Excellent communication skills for cross-functional team collaboration.
Availability to participate in the virtual weekend recruitment drive on April 4th.
Salary: Negotiable or Not Mentioned
India
20 days ago
pyxidiatech.com
1240 Views
We are seeking a highly skilled Senior Data Analyst to join our dynamic team and play a pivotal role in designing, building, and scaling innovative data solutions across our products and client implementations. The successful candidate will be responsible for developing scalable data pipelines, optimizing ETL workflows, and ensuring the highest standards of data quality and reliability. You will work closely with cross-functional teams, including Product, Engineering, and Data Science, to drive data architecture decisions and deliver actionable insights that solve complex business challenges. This role offers the opportunity to provide technical guidance and mentorship to junior analysts while working on impactful, data-driven projects.

This position is available in multiple locations across India, specifically Mumbai, Bangalore, and Pune. The ideal candidate will have over four years of experience in data-focused roles and a deep understanding of the AWS ecosystem, including Redshift, Athena, and EMR. Experience with US Healthcare Data is considered a significant advantage. Candidates must be proficient in Python, PySpark, and SQL, and possess a strong grasp of data modeling and performance optimization. If you are passionate about big data and looking to make a significant impact in a collaborative environment, we encourage you to apply.
Key Requirements
Minimum of 4 years of professional experience in Data Analytics or Data Engineering.
Demonstrated expertise in SQL and both relational and NoSQL database management.
Hands-on proficiency with Python and PySpark for processing large-scale datasets.
Proven experience in building and optimizing ETL/ELT pipelines using Airflow or AWS Glue.
Strong conceptual understanding of data modeling and performance tuning.
Advanced technical knowledge of AWS data services including S3, EMR, Redshift, and Athena.
Ability to design and manage complex data architectures across structured and unstructured sources.
Competency in maintaining high standards for data quality, validation, and monitoring.
Strategic thinking skills to perform analysis and generate actionable business insights.
Strong collaborative skills to work effectively with Product, Engineering, and Client teams.
Salary: Negotiable or Not Mentioned
India
20 days ago
se-mentor.com
1099 Views
We are looking for an experienced Databricks Engineer to enhance our data engineering capabilities at SE Mentor Solutions. In this role, you will be leveraging Azure services, PySpark, and Databricks to build high-performance data frameworks. Your contributions will help drive our data-driven decision-making processes by ensuring the reliability and scalability of our cloud-based data systems. This position offers the flexibility to work from either Cochin or Bengaluru.
Candidates should possess at least 5 years of experience in the data engineering field with a focus on Azure technologies. You will be responsible for designing and implementing efficient data processing logic and managing complex data architectures within the Azure cloud environment. As part of our team, you will participate in the full lifecycle of data projects, from requirements gathering to deployment and monitoring.
Key Requirements
Over 5 years of experience in data engineering or related technical roles.
Strong expertise in SQL and PySpark for data processing tasks.
Hands-on experience with Azure Databricks and Azure Data Factory.
Familiarity with Azure Cloud infrastructure and related services.
Experience in building and optimizing large-scale data architectures.
Ability to design and implement automated data pipelines.
Understanding of data warehousing concepts and technologies.
Strong troubleshooting skills for cloud-based data environments.
Effective teamwork and communication skills.
Commitment to maintaining high standards of data security and integrity.
Salary: Negotiable or Not Mentioned
India
20 days ago
se-mentor.com
1048 Views
SE Mentor Solutions is seeking a skilled ETL Developer to join our technical team. This role is responsible for designing, developing, and maintaining robust ETL processes to support our data integration needs. You will work closely with data architects and analysts to ensure data quality and system efficiency across various platforms. Possible work locations for this role include Cochin and Bengaluru.
The ideal candidate will have over 4 years of experience with a strong background in Advanced SQL and various ETL tools. You will be expected to optimize database performance, troubleshoot complex data issues, and contribute to the continuous improvement of our data infrastructure. This is an excellent opportunity to work on scalable solutions in a high-growth environment focused on technological innovation and data excellence.
Key Requirements
Minimum of 4 years of professional experience in ETL development.
Proficiency in Advanced SQL for complex query writing and optimization.
Experience with data modeling and database design principles.
Proven ability to build and maintain scalable data pipelines.
Strong analytical and problem-solving skills for troubleshooting data issues.
Knowledge of performance tuning for ETL processes and database systems.
Familiarity with version control systems such as Git.
Excellent communication skills for collaborating with cross-functional teams.
Ability to document technical processes and data workflows clearly.
Experience working in an Agile development environment.
Salary: ~₹140,000 per month (mentioned)
India, Remote
7 days ago
talentquell.com
615 Views
We are seeking a highly skilled MLOps Engineer to spearhead our cross-cloud machine learning model migration initiatives, specifically moving from GCP to Azure Databricks. The successful candidate will be responsible for building and optimizing production-grade MLOps workflows and CI/CD pipelines while implementing MLflow for meticulous model tracking and lifecycle management. You will develop scalable pipelines using Databricks and PySpark, ensuring seamless data movement and high model reliability. The budget for this position is ₹1.4 lakh per month.
In this role, you will also focus on performance and cost optimization of machine learning infrastructures. You will work closely with Data Science teams to enable efficient model deployment and monitoring across cloud environments. This position operates on a UK time shift (8 AM – 5 PM) and offers a remote working arrangement within India. Candidates should have a strong foundation in data engineering and pipeline orchestration to succeed in this dynamic environment.
Key Requirements
6–8 years of professional experience in MLOps or a related technical role.
Strong expertise in Databricks and PySpark for large-scale data processing.
Hands-on experience with MLflow for tracking experiments and managing model lifecycles.
Proven proficiency in CI/CD practices and workflows using tools like GitHub Actions.
Extensive experience working with cloud platforms, specifically Microsoft Azure and GCP.
Demonstrated ability to perform cross-cloud data movement and model migration tasks.
In-depth knowledge of model deployment strategies and continuous monitoring systems.
Strong background in data engineering and the orchestration of complex pipelines.
Excellent communication and collaboration skills for working with Data Science teams.
Availability to work according to the UK shift timings from 8 AM to 5 PM.
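A recurring piece of the CI/CD and model-lifecycle work described above is a promotion gate: before a migrated or retrained model is registered, the pipeline compares its metrics against the production model. A minimal sketch, assuming invented metric names and thresholds; a real setup would read these from an MLflow tracking server rather than hard-coding them:

```python
# Promotion gate a CI/CD job might run before registering a candidate model.
def should_promote(candidate_metrics, production_metrics, max_regression=0.01):
    # Promote only if every production metric is matched within a tolerance.
    for name, prod_value in production_metrics.items():
        cand_value = candidate_metrics.get(name)
        if cand_value is None or cand_value < prod_value - max_regression:
            return False
    return True

# Illustrative values, not from the posting.
prod = {"auc": 0.91, "recall": 0.84}
candidate = {"auc": 0.912, "recall": 0.835}  # small recall dip within tolerance
print(should_promote(candidate, prod))
```

Failing closed when a metric is missing (`cand_value is None`) matters in cross-cloud migrations, where an evaluation step can silently drop a metric.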
Salary: Negotiable or Not Mentioned
India, Remote
6 days ago
tekvo.io
587 Views
Tekvo is looking for a seasoned and highly motivated Azure Technical Lead to spearhead our cloud data engineering projects. In this critical role, you will be the driving force behind large-scale Azure analytics initiatives, overseeing the end-to-end development of high-impact data platforms. You will be responsible for defining the solution design, ensuring technical excellence across the delivery lifecycle, and providing strategic guidance to engineering teams. Your expertise will directly contribute to the creation of scalable, robust, and efficient data architectures that empower our clients to make data-driven decisions.
As an Azure Technical Lead, you must demonstrate mastery over core Azure data services such as Data Factory, Synapse, and Databricks. The role demands a blend of deep technical proficiency in SQL and Python along with the leadership skills required to mentor developers and manage complex stakeholder expectations. This remote position offers a unique opportunity for professionals based in India to work on cutting-edge cloud technologies within a collaborative environment. Successful candidates will be expected to maintain high standards of code quality and architectural integrity while driving innovation in the data engineering space.
Key Requirements
Possess 10-14 years of professional experience in data engineering and cloud platforms.
Demonstrate expert-level proficiency in designing and implementing Azure Data Factory pipelines.
Have hands-on experience with Azure Synapse Analytics for enterprise data warehousing.
Show strong technical expertise in using Azure Databricks for big data processing.
Maintain advanced knowledge of SQL for complex data manipulation and performance tuning.
Exhibit proficiency in Python programming for automating data workflows and engineering tasks.
Prove a track record of leading and delivering large-scale analytics initiatives on Azure.
Possess strong solution design skills with the ability to create scalable data architectures.
Demonstrate the ability to guide, mentor, and manage high-performing technical teams.
Experience in cloud security best practices and data governance frameworks is highly preferred.
Excellent communication skills to interact with stakeholders and translate business needs into technical solutions.
Salary: Negotiable or Not Mentioned
India, Hyderabad
17 days ago
xautomations.com
1006 Views
xautomations is looking for a part-time Data Modeler to join our team in Hyderabad. This role is focused on designing and maintaining efficient data structures that support our real-time systems and data pipelines. You will work on creating models that optimize data storage and retrieval for high-performance applications, ensuring data integrity and consistency across various platforms.
The position offers flexibility as a part-time role while providing the opportunity to work on complex, real-world data challenges within a professional engineering environment. You will collaborate with our engineering team to ensure that our data architecture is scalable and aligned with evolving business requirements. This is an office-based role located in Hyderabad.
Key Requirements
Strong background in data modeling techniques and methodologies.
Experience with relational and non-relational database design.
High proficiency in SQL for data manipulation and querying.
Understanding of data pipeline architectures and ETL processes.
Ability to create both logical and physical data models.
Knowledge of data warehousing concepts and star/snowflake schemas.
Experience using professional data modeling software and tools.
Collaborative mindset for working with data engineers and scientists.
Strong attention to detail regarding data governance and quality.
Effective communication skills to explain data structures to stakeholders.
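The star-schema knowledge the Data Modeler listing asks for boils down to fact tables holding measures keyed to dimension tables holding descriptive attributes. A toy in-memory illustration, with table and column names invented for the example:

```python
# Tiny star schema: one fact table keyed to two dimensions, plus a lookup
# that "joins" them the way a reporting query would.
dim_product = {1: {"name": "Notebook", "category": "Stationery"},
               2: {"name": "Monitor",  "category": "Electronics"}}
dim_date = {20240101: {"year": 2024, "month": 1}}

fact_sales = [
    {"product_key": 1, "date_key": 20240101, "qty": 3, "amount": 150.0},
    {"product_key": 2, "date_key": 20240101, "qty": 1, "amount": 9000.0},
]

def sales_by_category():
    # Aggregate the fact measure by a dimension attribute (category).
    totals = {}
    for row in fact_sales:
        category = dim_product[row["product_key"]]["category"]
        totals[category] = totals.get(category, 0.0) + row["amount"]
    return totals

print(sales_by_category())
```

A snowflake schema would further normalise `category` into its own table; the star form trades some redundancy for simpler, faster reporting joins.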
Salary: Negotiable or Not Mentioned
India, Hyderabad
19 days ago
endeavourtechnologies.co.in
1252 Views
Endeavour Technologies is looking for a skilled Test Engineer to join the team in Hyderabad. This role is pivotal in maintaining the quality standards of both web and mobile applications. The successful candidate will be responsible for analyzing requirements, drafting comprehensive test plans, and executing manual and automated test cases. You will be tasked with identifying, reporting, and tracking defects using standard management tools to ensure a seamless user experience and product reliability.
Beyond execution, you will play a key role in developing and maintaining automation test scripts and improving existing test frameworks. Collaboration is central to this position, as you will work alongside developers and product managers to resolve issues and refine technical documentation. This role requires a proactive individual with strong analytical skills and a deep understanding of testing life cycles, ready to contribute effectively to a fast-paced development environment in an office-based setting.
Key Requirements
2-3 years of hands-on experience in software quality assurance.
Strong proficiency in Manual and Automation Testing techniques.
Solid understanding of Software Testing Life Cycle (STLC).
Comprehensive knowledge of Software Development Life Cycle (SDLC).
Practical experience with automation tools like Selenium or TestNG.
Scripting proficiency in programming languages such as Java or JavaScript.
Demonstrated experience in API testing using tools like Postman.
Familiarity with defect management and tracking tools like Jira.
Excellent analytical thinking and technical problem-solving abilities.
Capacity to work full-time from the Hyderabad office.
Salary: Negotiable or Not Mentioned
India, Hyderabad
17 days ago
xautomations.com
1094 Views
We are currently seeking a Performance Test Engineer to join our Hyderabad-based team. Your primary focus will be ensuring that our real-time systems and high-performance platforms can handle scale and maintain stability under various load conditions. You will design and execute performance test plans that identify critical performance issues, latency, and throughput bottlenecks before deployment.
This role requires a proactive approach to finding and resolving system limitations. By working closely with our development and infrastructure teams, you will help optimize our systems for maximum efficiency and reliability. This is a full-time position located in our Hyderabad office, ideal for candidates who enjoy deep-diving into system metrics and ensuring software quality at scale.
Key Requirements
Specialized experience in performance, load, and stress testing.
Proficiency with testing tools such as JMeter, Locust, or Gatling.
Ability to analyze system bottlenecks, latency, and resource usage.
Experience testing real-time systems and data-intensive platforms.
Understanding of scalability principles and high-performance computing.
Scripting skills for creating automated performance test suites.
Collaboration skills for working with developers on system tuning.
Ability to generate and present detailed performance analysis reports.
Experience monitoring system resources during active stress tests.
Familiarity with CI/CD integration for automated performance gates.
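At its core, the load-testing work above means firing concurrent requests at a target and summarising latency percentiles. A back-of-the-envelope sketch in plain Python, with a sleep standing in for a real request; tools like JMeter, Locust, or Gatling do this at far greater scale with ramp-up profiles and richer reporting:

```python
import statistics
import time
from concurrent.futures import ThreadPoolExecutor

def target():
    time.sleep(0.01)  # stand-in for an HTTP call to the system under test

def timed_call():
    start = time.perf_counter()
    target()
    return time.perf_counter() - start

def run_load(concurrency=8, requests=40):
    # Fire `requests` calls across `concurrency` workers, collect latencies.
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        latencies = list(pool.map(lambda _: timed_call(), range(requests)))
    cuts = statistics.quantiles(latencies, n=100)  # 99 percentile cut points
    return cuts[49], cuts[94]  # p50, p95

p50, p95 = run_load()
print(f"p50={p50*1000:.1f}ms p95={p95*1000:.1f}ms")
```

Percentiles (rather than averages) are the usual pass/fail signal for the CI/CD "performance gates" mentioned in the requirements, since tail latency is what users feel.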
Salary: Negotiable or Not Mentioned
India, Remote
20 days ago
e-solutionsinc.com
1491 Views
As an AI Quality Analyst, you will evaluate a new personalization feature for Gemini. You will assess how well the model uses information from your past Gemini conversations, Gmail, Google Search, and YouTube activity to make responses more relevant and helpful. This role requires a unique blend of creativity and analytical rigor. You will actively design prompts from the perspective of your own personal experiences. You will then use your analytical skills to assess the quality of the model's personalized responses, evaluating dimensions like Grounding, Integration, and Helpfulness.
You will work as part of a multilingual team focused on languages such as Italian, German, French, Polish, Dutch, Bulgarian, Danish, Finnish, Greek, Norwegian, Romanian, and Swedish. This is a short-term contract lasting one month and requires a four-hour overlap with the PST time zone. Candidates must be comfortable using their primary personal Google account to facilitate a genuine assessment of the personalization features. This is a fully remote position available to candidates in India and several other eligible countries.
Key Requirements
Language Proficiency in one of the focus languages (Italian, German, French, etc.).
Ability to read and write in the focus language with a high degree of complexity.
Willingness to use a primary personal Google account for testing purposes.
Exceptional analytical thinking to evaluate nuanced and ambiguous AI responses.
Experience in creative prompt engineering and designing multi-turn prompts.
Superior written communication skills for writing clear and structured rationales.
Ability to provide constructive feedback and detailed annotations.
Functional desktop or laptop setup with a stable internet connection.
Ability to work a schedule with a 4-hour overlap with PST time zone.
Commitment to a 1-month short-term contract duration.
Knowledge of Google ecosystem services like Gmail, Search, and YouTube.
Salary: Negotiable or Not Mentioned
India, Remote / Hybrid
11 days ago
infogine.com
655 Views
Join a forward-thinking team dedicated to developing cutting-edge LLM-based solutions, autonomous AI agents, and RAG pipelines that aim to redefine the future of enterprise AI. As a Gen AI / LLM Specialist, you will be at the forefront of technical innovation, applying advanced AI techniques to solve complex business problems. This role requires a balance of research-oriented thinking and practical application to build robust, scalable AI systems. Possible work locations include remote or hybrid arrangements within India.
The ideal candidate will demonstrate technical leadership in fine-tuning large language models and optimizing prompt engineering workflows. You will collaborate with cross-functional teams to integrate Vector Databases and API frameworks into production-ready environments. A significant portion of the role involves ensuring Responsible AI practices and Agent Safety, maintaining high standards for the ethical deployment of autonomous systems. This position is designed for experienced professionals with a solid foundation in Python, NLP, and Deep Learning who are ready to take on a general shift role in a high-impact environment.
Key Requirements
6–10 years of professional experience in software development or AI research.
Demonstrated hands-on expertise with LLM Fine-Tuning and Prompt Engineering.
Advanced proficiency in Python programming for data science and AI applications.
Strong background in Natural Language Processing (NLP) and Deep Learning methodologies.
Proven experience building and optimizing Retrieval-Augmented Generation (RAG) pipelines.
Practical knowledge of Vector Databases such as Pinecone, Milvus, or Weaviate.
Experience designing and implementing robust API Frameworks for AI model integration.
Deep understanding of Responsible AI principles and safety protocols for autonomous agents.
Ability to work effectively in a general shift schedule within a hybrid or remote setup.
Strong problem-solving skills and the ability to translate business requirements into technical AI solutions.
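The RAG pipelines this role centres on have a retrieval step that picks context for the model before generation. A minimal sketch using keyword overlap in place of a real vector database (Pinecone, Milvus, or Weaviate would rank by embedding similarity instead); the documents and query are invented for illustration:

```python
documents = [
    "Reset your password from the account settings page.",
    "Invoices are emailed on the first business day of each month.",
    "Contact support to change the billing address on an invoice.",
]

def score(query: str, doc: str) -> int:
    # Crude lexical relevance: count shared lowercase terms.
    q_terms = set(query.lower().split())
    d_terms = set(doc.lower().replace(".", "").split())
    return len(q_terms & d_terms)

def retrieve(query: str, k: int = 1):
    # Return the top-k documents by score; this is the "R" in RAG.
    ranked = sorted(documents, key=lambda d: score(query, d), reverse=True)
    return ranked[:k]

query = "how do i change the billing address"
context = retrieve(query)[0]
# The retrieved context would then be injected into the LLM prompt:
prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
print(context)
```

Swapping the scoring function for embedding similarity (and the list for a vector index) upgrades this sketch to the production architecture the listing describes, without changing its shape.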
Salary: Negotiable or Not Mentioned
India
6 days ago
lovasit.com
470 Views
Lovas IT is seeking a highly skilled Cloud Leader – Agentic AI to architect and lead next-generation enterprise AI platforms built on multi-agent systems. This is an architect-level role requiring significant experience in building and deploying agent-based systems in production environments. The successful candidate will own the end-to-end architecture of Agentic AI systems powering enterprise use cases such as policy lifecycle, underwriting, and claims. This position is available across multiple locations in India, specifically Mumbai, Bangalore, Mangaluru, and Udupi.
The role involves defining agent orchestration, memory management, tool usage, and guardrails while optimizing systems for latency, cost, and reliability. You will collaborate with product, engineering, and domain teams to establish evaluation frameworks and AI safety controls. Ideal candidates should have deep expertise in the Google ADK and the GCP ecosystem, specifically Vertex AI. You must be capable of handling real-world challenges beyond theoretical models, ensuring safe, scalable, and cost-efficient AI deployments.
Key Requirements
7.5 to 12 years of experience in architecture focusing on AI/ML and distributed systems.
Hands-on experience with Large Language Models (LLMs) including Prompting, RAG, Fine-tuning, and Evaluation.
Proven experience in agent-based or LLM-driven system design within a production environment.
Strong knowledge of microservices and event-driven architectures.
Demonstrated expertise in Agentic AI Architecture as a core technical skill.
Ability to architect multi-agent systems using the Google Agent Development Kit (ADK).
Experience defining agent orchestration, memory management, tool usage, and AI guardrails.
Skills in designing LLM + API + workflow + event-driven architectures.
Capability to optimize systems for latency, cost, reliability, and observability.
Experience establishing evaluation frameworks, prompt strategies, and AI safety controls.
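The orchestration, tool usage, and guardrail responsibilities above can be sketched as a single agent loop: a planner chooses the next tool call, a registry executes it, and a bounded step budget acts as a guardrail. The sketch below uses a scripted planner and invented insurance-flavoured tools in place of an LLM and the Google ADK; all names and the task are illustrative:

```python
# Toy single-agent loop: tool registry + step-budget guardrail + planner.
TOOLS = {
    "lookup_policy": lambda arg: {"policy_id": arg, "status": "active"},
    "calc_premium": lambda arg: {"premium": 120.0 if arg == "P-1" else 0.0},
}

def planner(task, history):
    # A real agent would ask an LLM to pick the next tool given `history`;
    # here the plan is scripted for determinism.
    if not history:
        return ("lookup_policy", "P-1")
    if len(history) == 1:
        return ("calc_premium", "P-1")
    return None  # done

def run_agent(task, max_steps=5):
    history = []  # doubles as the agent's short-term memory
    for _ in range(max_steps):  # guardrail: hard bound on tool calls
        step = planner(task, history)
        if step is None:
            break
        tool, arg = step
        history.append((tool, TOOLS[tool](arg)))
    return history

result = run_agent("quote premium for policy P-1")
print(result[-1][1])
```

Frameworks such as the Google ADK provide these pieces (planner, tool registry, memory, guardrails) as first-class abstractions; the value of the role is in designing how they compose across many agents, not in the loop itself.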