Salary: Negotiable or not mentioned
India, Mumbai
16 days ago
straive.com
1248 Views
Straive is seeking a dedicated Treasury Data Scientist to join our team in Mumbai under a hybrid work arrangement. This role is specifically designed for professionals with 3 to 6 years of experience who possess a deep understanding of treasury functions such as cash management, foreign exchange (FX), liquidity, and risk assessment. The successful candidate will be responsible for leveraging advanced data science methodologies to optimize financial operations and provide actionable insights through sophisticated modeling and data visualization.
In this position, you will utilize technical expertise in Python, SQL, and various BI tools like Power BI, Qlik, or Tableau to manipulate complex datasets. You will also be expected to implement machine learning and Generative AI applications within the finance domain to stay ahead of industry trends. We are looking for an immediate joiner with a strong educational background in Data Science, Finance, or Economics who can communicate data-driven stories effectively to stakeholders. This role provides an excellent opportunity to work at the intersection of high-level finance and cutting-edge data technology.
Key Requirements
3 to 6 years of professional experience in data science or financial analytics.
Strong proficiency in Business Intelligence tools such as Power BI, Qlik, or Tableau.
Expertise in SQL and Python for data manipulation, modeling, and automation.
Hands-on experience with statistical modeling and machine learning techniques.
Advanced knowledge of Excel, including Power Query and VBA programming.
Prior exposure to Machine Learning and Generative AI applications in a finance context.
Solid understanding of Data Warehousing concepts and ETL (Extract, Transform, Load) processes.
In-depth knowledge of treasury functions including Cash, FX, Liquidity, and Risk management.
Bachelor’s or Master’s degree in Data Science, Finance, Economics, or a related quantitative field.
Excellent English communication skills with the ability to perform data storytelling.
Salary: Negotiable or not mentioned
India, Ahmedabad
25 days ago
cubegle.com
2181 Views
Cubegle Technologies Pvt LTD is seeking a motivated and detail-oriented individual to join our team as a Data Engineer Intern. This role is designed for candidates who are passionate about data and eager to kickstart their career in the field of data engineering. During this full-time internship, you will have the unique opportunity to gain hands-on industry experience by working on real-world data projects. You will work alongside experienced professionals in a friendly and growth-oriented environment that encourages learning and professional development. The primary focus will be on supporting data infrastructure and participating in various stages of the data lifecycle.
As an intern, you will apply your knowledge of SQL, Python, and Power BI to assist in building and maintaining data pipelines. You will gain exposure to essential data concepts such as ETL (Extract, Transform, Load), Cloud ETL, and data warehousing principles. The internship is based at our Ahmedabad office located at Link, 100 FT, RCC ROAD, near JLR Showroom, Upper, Gota. We are looking for candidates who possess strong analytical skills and a proactive approach to problem-solving. This position is an excellent stepping stone for anyone looking to build a solid foundation in the data engineering industry while contributing to meaningful projects.
Key Requirements
Basic knowledge of SQL and database management systems.
Familiarity with Power BI for data visualization and reporting.
Proficiency in Python programming for data manipulation.
Foundational understanding of data concepts including ETL and Cloud ETL.
Basic understanding of data warehousing architecture and principles.
Eagerness to learn and grow within the fast-paced data field.
Strong analytical and logical reasoning capabilities.
Excellent written and verbal communication skills.
Ability to work effectively in a collaborative team environment.
Strong attention to detail and commitment to data accuracy.
Salary: ~₹291,666 (mentioned)
India, Mumbai
11 days ago
nextjobhunt.com
607 Views
We are currently seeking a highly skilled Data Engineer to join a leading semi-government bank in Mumbai. This is a high-impact role within the financial sector, where you will be responsible for designing and building scalable batch and real-time data pipelines. You will work on enterprise-level data platforms that significantly impact credit, risk, and operations. The position is a full-time contract for an initial period of three years, with the possibility of extension based on performance and project requirements. The CTC for this role is 35–40 LPA.
In this role, you will develop data models, marts, and feature stores for advanced analytics and reporting. You will also be tasked with implementing data quality, lineage, and governance frameworks to ensure data integrity across the organization. Security and compliance are paramount, so you must ensure all data platforms align with regulatory standards. You will support data science and analytics teams with optimized datasets. Candidates should have a strong background in SQL, Spark, and Python/Java/Scala, alongside experience with cloud platforms such as AWS, Azure, or GCP.
Key Requirements
Minimum of 8 years of experience in Data Engineering or a similar role.
Strong proficiency in SQL and Data Modelling, including OLTP/OLAP and Star/Snowflake schemas.
Hands-on experience with programming languages like Python, Scala, or Java.
Expertise in Big Data processing frameworks, specifically Spark.
Experience in building and managing ETL/ELT pipelines using tools like Airflow or Data Factory.
Extensive exposure to Cloud Platforms including AWS, Azure, or GCP.
Knowledge of NoSQL and Graph Databases.
Demonstrated understanding of Data Governance, Master Data Management (MDM), and Data Quality.
Prior experience working within the Banking, NBFC, or Financial Services industry.
Understanding of regulatory and compliance data requirements for the financial sector.
Salary: Negotiable or not mentioned
India, Pune
27 days ago
onspaceglobal.com
1479 Views
Onspace Global is seeking a highly skilled Oracle EPM Data Integration – Senior Analyst to join our team in Pune. This permanent role is ideal for a professional with over 6 years of experience who possesses deep expertise in Oracle EPM and Hyperion Data Integration. The successful candidate will be responsible for managing FDMEE and Data Management processes, ensuring seamless ETL operations including data loading, transformation, and reconciliation. You will also be tasked with automating and scheduling data loads, monitoring jobs, and utilizing SQL and scripting languages like Python or Groovy to optimize performance.
In addition to technical prowess, the Senior Analyst must have a solid understanding of core finance processes such as planning, consolidation, and reporting. Hands-on experience with Oracle EPM modules, including ARCS, is essential. We are looking for an immediate joiner who can hit the ground running and contribute to our EPM Cloud API integration efforts. If you are a proactive problem-solver with a background in financial data quality management, we encourage you to apply and help us streamline our enterprise data workflows. This position offers an opportunity to work on complex cloud-based financial systems in a fast-paced environment.
Key Requirements
Expertise in Oracle EPM / Hyperion Data Integration.
Extensive experience in FDMEE (Financial Data Quality Management Enterprise Edition) and Data Management.
Strong knowledge of ETL processes including data load, transformation, and reconciliation.
Experience in automating and scheduling data loads and job monitoring.
Strong SQL skills with scripting experience in Python, Groovy, or Shell.
Detailed knowledge of EPM Cloud APIs for integration purposes.
Solid understanding of Finance Processes including Planning, Consolidation, and Reporting.
Hands-on experience working with Oracle EPM modules such as Planning and Consolidation.
Proficiency in ARCS (Account Reconciliation Cloud Service).
Must be an immediate joiner, available to start right away.
Minimum of 6 years of relevant professional experience in data integration.
Ability to troubleshoot complex data reconciliation issues independently.
Salary: ~₹291,666 (mentioned)
India, Mumbai
11 days ago
nextjobhunt.com
701 Views
We are seeking a highly experienced Data Engineer for a high-impact role with a leading financial institution (a semi-government bank) located in BKC, Mumbai. In this position, you will be responsible for designing and building scalable batch and real-time data pipelines, as well as developing data models, marts, and feature stores for advanced analytics and reporting. You will also play a critical role in implementing data quality, lineage, and governance frameworks to ensure data security and compliance with regulatory standards. The role is a full-time contract for an initial three-year term, which is extendable. The salary offered for this position is 35–40 LPA.
As a core member of the data team, you will support data science and analytics units with optimized datasets that impact credit, risk, and banking operations. You will work on enterprise-level data platforms using modern technologies such as Spark, Python, and various cloud platforms. This is a unique opportunity for a professional with over 8 years of experience to work in a high-stakes environment within the financial services sector, contributing to the development of robust data infrastructures that drive business decisions.
Key Requirements
Minimum of 8 years of professional experience in Data Engineering or related roles.
Strong proficiency in SQL and Data Modelling including OLTP/OLAP and Star/Snowflake schemas.
Hands-on experience with programming languages such as Python, Scala, or Java combined with Spark.
Proven experience working with ETL/ELT tools like Airflow or Azure Data Factory.
Significant exposure to major Cloud Platforms including AWS, Azure, or GCP.
In-depth knowledge of NoSQL and Graph Databases for varied data storage needs.
Solid understanding of Data Governance, Master Data Management (MDM), and Data Quality standards.
Previous professional experience in Banking, NBFC, Financial Services, or Fintech industries.
Deep understanding of regulatory and compliance data requirements specific to the financial sector.
Relevant certifications in Cloud Computing or Data Engineering are highly preferred.
Ability to build and maintain scalable real-time and batch data pipelines.
Strong communication skills to collaborate with data science and operations teams.
Salary: Negotiable or not mentioned
India, Pune
10 days ago
lotechpro.in
710 Views
We are seeking a highly skilled Technical Lead to join our engineering team in a hybrid role based in Pune, India. In this capacity, you will provide technical leadership and oversight for complex software projects, ensuring the delivery of scalable microservices and robust distributed systems. Your role will involve mentoring junior developers, making architectural decisions, and working closely with stakeholders to align technical strategies with business goals. You will be instrumental in leveraging Google Cloud Platform services to build high-performance data-driven applications.
The ideal candidate will have extensive experience with Go and Python programming, alongside a deep proficiency in BigQuery and SQL for data modeling and query optimization. You will be responsible for analyzing large datasets to derive actionable insights and implementing modern DevOps practices, including CI/CD and containerization with Docker and Kubernetes. We are looking for a proactive problem-solver who can lead by example and maintain high standards of code quality and performance across our technology stack. This position offers an exciting opportunity to work at the forefront of cloud technology in a collaborative and innovative environment.
Key Requirements
6+ years of software engineering experience.
2+ years of experience in a Technical Leadership role.
Strong hands-on experience with Google Cloud Platform (GCP) including Compute Engine and Pub/Sub.
Proficiency in BigQuery and SQL with strong data analysis and data modeling skills.
Strong programming expertise in Go (Golang) and Python.
Strong understanding of relational and NoSQL databases like PostgreSQL, MySQL, and Spanner.
Experience building scalable microservices and distributed systems.
Solid understanding of CI/CD, Git, and modern DevOps practices.
Experience with Kubernetes, Docker, or serverless technologies on GCP.
Exposure to ETL/ELT frameworks, Dataflow, Dataproc, or Airflow.
Salary: Negotiable or not mentioned
India, Pune
10 days ago
techedge-solution.com
688 Views
Techedge Solution is seeking a highly experienced professional for the position of Python Lead based in Pune. This role is designed for a technical leader with at least 9 years of experience in software development, particularly focusing on Python and the Django framework. The ideal candidate will be responsible for overseeing backend development, integrating AI functionalities, and collaborating with data science teams to deliver robust and scalable solutions. As a lead, you will provide technical guidance, perform code reviews, and ensure that the architectural designs meet the high standards of the organization while keeping up with the latest industry trends in Artificial Intelligence and Machine Learning.
Candidates applying for this role must be currently serving their notice period and able to join within 15 days or less. This position offers a unique opportunity to work at the intersection of backend engineering and data science, making it a perfect fit for someone passionate about Backend AI. You will be working in a fast-paced environment where your expertise in Python and data-driven systems will directly impact the company's growth. The role is based in Pune, India, and requires a candidate who is ready to take immediate responsibility for critical project components.
Key Requirements
Minimum of 9 years of professional experience in software development.
Advanced proficiency in Python programming language.
Extensive experience with the Django web framework for backend systems.
Proven track record in Backend AI development and implementation.
Strong understanding of Data Science concepts and methodologies.
Must currently be serving notice, able to join immediately or within 15 days.
Solid experience in Machine Learning and Deep Learning architectures.
Ability to lead and mentor a team of developers effectively.
Expertise in designing and maintaining scalable backend infrastructure.
Strong communication skills for effective team collaboration.
Familiarity with API development and database management.
Degree in Computer Science, Information Technology, or a related field.
Salary: Negotiable or not mentioned
India, Ahmedabad
15 days ago
shreeyaansolusmart.com
1189 Views
Join a progressive, data-driven organization as a Data Analyst in Ahmedabad. This on-site role offers an exciting opportunity to work with diverse data sources and transform raw data into actionable insights that drive strategic decisions. You will be responsible for analyzing data from SAP, Excel, and Power BI, building impactful dashboards, and developing automated reporting systems to support decision-making processes across the organization. You will collaborate closely with cross-functional teams in Finance, Operations, and Supply Chain to ensure data accuracy and relevance.
The role requires working Monday to Friday on a UK shift schedule, demanding a high level of independence and strong English communication skills. You will be expected to clean, transform, and manage large datasets while maintaining high standards of data integrity. If you are passionate about turning data into powerful business insights and have a strong background in analytical tools like Power BI and SAP, this is the perfect career opportunity for you to drive significant business impact within a professional environment.
Key Requirements
Minimum of 3 years of professional experience as a Data Analyst.
Advanced proficiency in Microsoft Excel, including PivotTables, VLOOKUP, and Power Query.
Strong experience with Power BI for creating dashboards and data modeling.
Familiarity with SAP reporting and data extraction processes.
Proven ability to clean, transform, and manage large and complex datasets.
Strong analytical and problem-solving skills to derive actionable insights.
Excellent communication skills for collaborating with cross-functional teams.
Ability to manage multiple tasks independently in a fast-paced environment.
Flexibility to work in a UK shift schedule from Monday to Friday.
Fluency in English for both written and verbal professional communication.
Salary: Negotiable or not mentioned
India, Pune
19 days ago
techedge-solution.com
985 Views
Techedge Solution is currently seeking a visionary and experienced Gen AI Lead to join our growing team in Pune. This role is designed for a technical expert who possesses over eight years of professional experience in the field of software development and artificial intelligence. As the lead for our Generative AI initiatives, you will be responsible for spearheading the design and deployment of sophisticated AI models that leverage Large Language Models (LLM) and Retrieval-Augmented Generation (RAG) frameworks to drive innovation and efficiency.
In this leadership capacity, you will not only be involved in high-level architectural decisions but also remain hands-on with Python development. You will guide a team of talented engineers, fostering a culture of technical excellence and continuous learning. Your deep understanding of the AI landscape will be crucial in identifying new opportunities for applying Generative AI across various business domains. If you are a proactive leader with a passion for cutting-edge technology and a desire to make a significant impact, we invite you to apply.
Key Requirements
Minimum of 8 years of professional experience in software engineering or AI roles.
Proven expertise in Python programming and related libraries.
Deep hands-on experience with Large Language Models (LLM).
Demonstrated experience implementing Retrieval-Augmented Generation (RAG).
Strong leadership skills with the ability to manage and mentor a technical team.
In-depth knowledge of Generative AI principles and practical applications.
Ability to architect complex AI solutions from the ground up.
Experience with cloud platforms such as AWS, Azure, or GCP for AI services.
Familiarity with vector databases and semantic search technologies.
Excellent communication skills and the ability to present technical concepts to stakeholders.
Salary: Negotiable or not mentioned
India, Pune (Baner)
19 days ago
ideastoimpacts.com
1020 Views
Ideas to Impacts Digital is seeking a Software Test Engineer with specialized exposure to AI systems to contribute to the validation of next-generation AI applications. This role is central to evaluating AI-powered systems built on Large Language Models, Retrieval-Augmented Generation, and Agentic AI workflows. The position is based in Pune (Baner) and follows a hybrid work model requiring three days of onsite presence per week. Candidates will join a team dedicated to ensuring the reliability and accuracy of advanced artificial intelligence implementations.
Responsibilities include evaluating AI-generated responses for accuracy, relevance, and consistency while checking for potential hallucinations. The successful candidate will validate RAG pipelines and document retrieval grounding, alongside testing multi-step reasoning within agent-based AI workflows. The role involves designing structured AI test scenarios, writing Python scripts for system evaluation, and working closely with AI engineers to debug and optimize system behavior. Strong analytical skills and a background in Python and API testing are essential for this technical position.
Key Requirements
Strong proficiency in Python programming for script development.
Direct exposure to LLM-based applications or AI chatbot systems.
Deep understanding of RAG (Retrieval-Augmented Generation) pipelines.
Experience in test case design specifically for AI and API testing.
Ability to evaluate AI responses for accuracy, relevance, and hallucinations.
Skill in validating document retrieval grounding within AI systems.
Experience testing agent-based AI workflows and multi-step reasoning.
Capability to design structured AI test scenarios and complex queries.
Proven experience writing Python scripts for AI evaluation and testing.
Ability to work closely with AI engineers to identify and debug system behavior.
Understanding of hybrid software development and testing cycles.
Knowledge of software quality assurance best practices and methodologies.