Salary: Negotiable or Not Mentioned
India, Bengaluru
28 days ago
primusglobal.com
2270 Views
Primus Global is seeking a passionate and experienced BI Developer to join our dynamic team in Bengaluru. In this role, you will be responsible for designing and delivering intuitive, interactive dashboards and scorecards that translate complex business requirements into actionable data stories. You will collaborate closely with Business Analysts and Analytics teams to prototype and implement innovative solutions that empower business decisions through data-driven insights. This position is ideal for someone who thrives on turning raw data into compelling visual stories and is comfortable working in a fast-paced environment.
The successful candidate will focus on optimizing data models and dashboard performance for large-scale datasets, ensuring data accuracy through rigorous validation and documentation. You will contribute to BI architecture decisions and implement role-based security to protect sensitive information. Beyond technical execution, you will drive user adoption through storytelling, training, and sharing best practices within the organization. With a focus on enterprise-scale BI implementations, you will leverage modern tools and cloud platforms to deliver high-impact results for global teams.
Key Requirements
Bachelor’s degree in Computer Science, IT, Engineering, Mathematics, or Statistics.
3+ years of experience with BI tools such as Power BI, Tableau, or Qlik Sense.
2+ years of strong SQL experience for data extraction and transformation.
Experience integrating multiple data sources into visualization platforms.
Ability to design and deliver intuitive, interactive dashboards and scorecards for business KPIs.
Proven track record in translating business requirements into visualization-driven insights.
Expertise in optimizing data models and dashboard performance for large-scale datasets.
Proficiency in data validation, testing, and documentation to ensure data accuracy.
Knowledge of BI architecture decisions and implementation of role-based security.
Experience with cloud platforms like Microsoft Fabric or Azure ecosystem.
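The "strong SQL for data extraction and transformation" requirement above can be illustrated with a minimal sketch. The table and column names are hypothetical, and SQLite stands in for whatever source database the BI tool would actually query; the point is producing a tidy aggregate for a dashboard to consume:

```python
import sqlite3

# Hypothetical sales table standing in for a real source system.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, amount REAL, sale_date TEXT)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("South", 1200.0, "2024-01-05"),
     ("South", 800.0, "2024-01-20"),
     ("North", 500.0, "2024-01-11")],
)

# Extraction + transformation: a monthly revenue-by-region aggregate
# of the kind a Power BI / Tableau / Qlik dashboard would be pointed at.
rows = conn.execute(
    """
    SELECT region,
           strftime('%Y-%m', sale_date) AS month,
           ROUND(SUM(amount), 2)        AS revenue
    FROM sales
    GROUP BY region, month
    ORDER BY region
    """
).fetchall()
print(rows)  # [('North', '2024-01', 500.0), ('South', '2024-01', 2000.0)]
```

In practice the aggregate would live in a view or semantic model rather than an ad-hoc query, so the dashboard and any downstream reports share one definition of the KPI.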
Salary: Negotiable or Not Mentioned
India, Bangalore
3 days ago
yeswayconsultancy.in
243 Views
Yesway Consultancy is seeking a high-impact Business Analyst specializing in Analytics and Business Intelligence to join the team in Bangalore. This role is designed for a professional with excellent communication skills and a strong ability to manage stakeholders, confidently engaging with senior leadership to drive analytics initiatives from conception to completion. The ideal candidate will be a proactive problem-solver who can lead business discussions, identify key analytics use cases, and translate business needs into technical problem statements and actionable solutions.
The successful candidate will be responsible for the end-to-end requirements of analytics projects, including the definition of enterprise KPIs, metrics, and performance frameworks. Technical proficiency in BI platforms like Power BI or Tableau, combined with deep exposure to the SAP Analytics ecosystem (BW, HANA, S/4), is essential. You will also be tasked with creating detailed documentation such as functional specifications, KPI logic, and data mappings, while ensuring data integrity through rigorous dashboard and data validation techniques. Immediate joiners or those with a notice period of up to 15 days are preferred.
Key Requirements
Proven experience as a Business Analyst specifically within Analytics, BI, or Data environments.
Demonstrated experience leading complex, cross-functional analytics initiatives.
Strong domain expertise in at least one area: Supply Chain, Finance, Sales, Manufacturing, or Operations.
Extensive stakeholder management experience, particularly with senior business leaders.
Hands-on experience with BI and Analytics platforms such as Power BI or Tableau.
Deep exposure to the SAP Analytics ecosystem, including BW, HANA, and S/4.
Strong understanding of Data Modelling, KPI frameworks, and semantic layers.
Proficiency in dashboard validation and data validation techniques.
Working knowledge of SQL for data querying and analysis.
Excellent communication skills and the ability to lead high-level business discussions.
Ability to create detailed functional and analytical documentation.
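The dashboard- and data-validation requirement above is, at its core, reconciling the figures a BI layer reports against the source of truth. A minimal sketch of such a check (the KPI names and figures are invented for illustration; the source side might be SAP BW/HANA extracts in this role):

```python
# Source-of-truth totals vs. the figures a dashboard extract reports,
# keyed by hypothetical KPI names.
source_totals = {"net_revenue": 105_000.0, "order_count": 412}
dashboard_totals = {"net_revenue": 104_999.6, "order_count": 412}

def reconcile(source, dashboard, rel_tol=1e-4):
    """Return the KPIs whose dashboard value drifts beyond tolerance."""
    mismatches = []
    for kpi, expected in source.items():
        actual = dashboard.get(kpi)
        if actual is None or abs(actual - expected) > rel_tol * abs(expected):
            mismatches.append(kpi)
    return mismatches

print(reconcile(source_totals, dashboard_totals))  # [] — within tolerance
```

A small relative tolerance is deliberate: rounding in the semantic layer is normal, whereas a KPI drifting by whole percentage points usually signals a broken data mapping.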
Salary: Negotiable or Not Mentioned
India, Bengaluru
8 days ago
huemot.com
418 Views
We are seeking a highly experienced Data Engineering Lead to spearhead a critical engagement within our Capital Markets practice. Based in Bengaluru, this role involves supporting a prominent Private Equity firm headquartered in New York. The successful candidate will oversee the development and maintenance of high-impact data pipelines and lakehouse architectures using cutting-edge technologies. You will work closely with stakeholders to translate business requirements into technical specifications, ensuring high data quality and system reliability across the enterprise.
You will be responsible for leading an offshore team of 5 to 7 engineers, ensuring the delivery of production-grade data solutions through mentorship and technical oversight. This position requires deep expertise in Azure Databricks and PySpark, along with a solid understanding of data governance through Unity Catalog. Candidates must possess a strong background in U.S. Capital Markets or Private Equity to effectively meet the complex data needs of our clients. Successful applicants will demonstrate a history of architectural excellence and the ability to navigate complex financial data landscapes.
Key Requirements
15+ years of enterprise data engineering experience
Databricks Certified Data Engineer (mandatory certification)
5+ years of hands-on experience specifically on Azure Databricks
5+ years of hands-on PySpark experience with production-grade pipelines
Strong knowledge of Unity Catalog and data governance frameworks
Proven experience leading offshore teams of 5–7 engineers
Domain experience in U.S. Capital Markets, Private Equity, or Investment Management
Expertise in lakehouse architecture and modern data stack design
Advanced proficiency in SQL for complex data transformations
Strong understanding of CI/CD practices for automated data pipelines
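The CI/CD requirement above usually means pipeline changes are gated by automated data-quality checks before promotion. A tool-agnostic sketch of such a gate (column names are hypothetical; in a Databricks setup this logic would typically run as a job task or a CI step before a release):

```python
def quality_gate(records, required_columns=("trade_id", "notional")):
    """Fail fast if a batch is empty, or has null/missing key columns."""
    errors = []
    if not records:
        errors.append("batch is empty")
    for col in required_columns:
        if any(col not in r or r[col] is None for r in records):
            errors.append(f"null or missing values in '{col}'")
    return errors

batch = [
    {"trade_id": "T-1", "notional": 5_000_000.0},
    {"trade_id": "T-2", "notional": 1_250_000.0},
]
print(quality_gate(batch))  # [] — the batch passes and can be promoted
```

Returning a list of all failures, rather than raising on the first one, makes the gate's CI output more useful: one run surfaces every violated expectation in the batch.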
Salary: Negotiable or Not Mentioned
India, Chennai
22 days ago
raahtechservices.com
1563 Views
Raahtechservices is currently seeking an experienced Senior QA Engineer to join their dynamic team in Chennai. This role is pivotal in ensuring the delivery of high-quality software through both manual and automated testing processes. The ideal candidate will be responsible for identifying defects, developing test plans, and executing comprehensive test cases across various platforms to maintain the integrity of the product.
Technically, the role requires deep expertise in tools like Playwright and Postman, as well as a strong command of SQL for database validation. Working within an Azure environment, the Senior QA Engineer will collaborate closely with developers to implement automated testing workflows. This position offers an opportunity to lead quality initiatives and contribute to the continuous improvement of the software development lifecycle in an agile setting.
Key Requirements
Manual Testing
Automation Testing
SQL
Postman
Playwright
Azure
Advanced Software Bug Tracking
Test Case Development and Execution
Regression Testing expertise
Experience in Agile Methodologies
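Much of the Postman-style API testing listed above boils down to asserting on response payloads. A small sketch of such a check, using a canned response body rather than a live endpoint (the user schema is invented for illustration):

```python
import json

def check_user_response(body):
    """Return a list of failures, Postman-test style, for a user payload."""
    failures = []
    try:
        payload = json.loads(body)
    except json.JSONDecodeError:
        return ["body is not valid JSON"]
    if payload.get("status") != "active":
        failures.append("status should be 'active'")
    if not isinstance(payload.get("id"), int):
        failures.append("id should be an integer")
    return failures

canned = json.dumps({"id": 42, "status": "active"})
print(check_user_response(canned))  # [] — all checks pass
```

The same assertions translate almost one-to-one into a Postman test script or a Playwright API test, which is why codifying them once in a shared checklist keeps manual and automated coverage consistent.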
Salary: ~200,000 (Mentioned)
India, Bangalore
6 days ago
smartreferhub.in
428 Views
SmartReferHub is looking for a Lead Data Engineer with 6 to 8 years of experience to join their team in Bangalore. This high-impact hybrid role involves working on advanced Databricks and AWS Lakehouse architecture to lead large-scale data transformations. You will be responsible for driving enterprise-level analytics for global operations and accelerating the company's data strategy. The successful candidate will work on cutting-edge data technologies and lead impactful projects that shape the future of data engineering within the organization. Joining is expected within 30 days. The offered salary for this position ranges from ₹24 to ₹28 LPA, providing an excellent opportunity for career growth in the data analytics sector.
Key Requirements
Minimum 6 to 8 years of professional experience in data engineering roles.
Strong hands-on experience with Databricks and AWS Lakehouse architecture.
Proven track record of leading large-scale data transformations in an enterprise environment.
Ability to drive analytics solutions for global operations and cross-functional teams.
Deep expertise in Big Data technologies and cloud-based data ecosystems.
Strong proficiency in programming languages such as Python or Scala for data processing.
Expertise in writing complex SQL queries and optimizing data performance.
Solid understanding of ETL and ELT pipeline design and maintenance.
Experience with data modeling, data warehousing, and lakehouse concepts.
Strong leadership skills with the ability to manage technical projects and mentor team members.
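The ETL/ELT design requirement above follows the same extract-transform-load shape regardless of scale. A deliberately tiny sketch of that shape, with an inlined CSV feed and SQLite standing in for the warehouse (in this role the real pipeline would run on Databricks/AWS):

```python
import csv
import io
import sqlite3

# Extract: parse a raw CSV feed (inlined here; normally a file or API drop).
raw = "order_id,amount\n1,100.50\n2,\n3,75.25\n"
rows = list(csv.DictReader(io.StringIO(raw)))

# Transform: drop records with missing amounts and cast to proper types.
clean = [(int(r["order_id"]), float(r["amount"])) for r in rows if r["amount"]]

# Load: write the cleaned batch into a warehouse-style table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
conn.executemany("INSERT INTO orders VALUES (?, ?)", clean)
total = conn.execute("SELECT ROUND(SUM(amount), 2) FROM orders").fetchone()[0]
print(total)  # 175.75
```

In an ELT variant the raw rows would be loaded first and the null-handling pushed into SQL inside the warehouse; the trade-off is mostly about where compute is cheapest and where you want the audit trail of rejected records.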
Salary: Negotiable or Not Mentioned
India, Bangalore
10 days ago
fxconsulting.in
774 Views
We are seeking a highly skilled Technical Lead for Data Engineering to join our dynamic team in Bangalore. This role is centered on building and scaling high-performance data systems that support our product-driven initiatives. As a lead, you will be at the forefront of designing scalable ETL pipelines and leveraging technologies such as Spark, Hadoop, and Kafka for large-scale data processing. Your expertise will ensure that our data infrastructure is robust, efficient, and capable of handling complex data workloads.
In addition to your technical responsibilities, you will provide leadership to the engineering team and work collaboratively with Data Scientists to optimize data models and ensure top-tier data quality and security. You will be expected to monitor and troubleshoot data pipelines while maintaining high standards for data governance. The ideal candidate brings 6 to 9 years of experience, a strong background in Python or Scala, and a deep understanding of cloud platforms like AWS, Azure, or GCP. This is a fantastic opportunity for a professional looking to lead engineering excellence in a fast-paced environment.
Key Requirements
6 to 9 years of professional experience in Data Engineering.
Proven expertise in Spark and other Big Data technologies.
Proficiency in coding with Python, Scala, or Java.
Extensive experience in developing and optimizing ETL pipelines.
Hands-on experience with cloud platforms such as AWS, Azure, or GCP.
Strong knowledge of Hadoop and Kafka for large-scale data processing.
Demonstrated experience in team handling and leadership roles.
Ability to design and optimize complex data models.
Understanding of data quality, governance, and security principles.
Exceptional problem-solving skills and ability to work in fast-paced environments.
Salary: Negotiable or Not Mentioned
India, Calicut
30 days ago
xylemlearning.com
1873 Views
Xylem Learning is looking for a skilled Sales Business Analyst to join our dynamic team in Calicut, Kerala. In this role, you will be responsible for overseeing sales operations, managing CRM data via LeadSquared, and utilizing advanced analytical tools to drive business growth. The ideal candidate will bridge the gap between data and strategy, ensuring that our sales processes are as efficient and effective as possible within the fast-paced EdTech environment. You will work closely with leadership to provide insights that influence key business decisions.
Your primary responsibilities will include using advanced Microsoft Excel techniques to identify bottlenecks in the sales funnel and implementing automated workflows to streamline operations. We are looking for a proactive professional who can take ownership of data integrity and provide the sales team with the actionable intelligence they need to succeed. By optimizing our CRM usage and reporting structures, you will help Xylem Learning reach its strategic goals while enhancing your own career in sales operations and business intelligence.
Key Requirements
1–2 years of hands-on experience with LeadSquared CRM.
2+ years of experience in a Data Analyst or Sales Operations role.
Advanced Microsoft Excel skills including pivot tables and complex formulas.
A proactive mindset to identify bottlenecks and automate workflows.
Strong analytical skills with the ability to interpret complex data sets.
Excellent communication skills to collaborate with cross-functional teams.
Proven ability to manage and maintain CRM data integrity.
Experience in the EdTech or education sector is highly preferred.
Ability to create detailed performance reports and dashboards.
Bachelor's degree in Business Administration, Data Science, or a related field.
Salary: Negotiable or Not Mentioned
India, Bangalore
5 days ago
careernet.in
358 Views
My client within the Pharmaceutical sector is looking to expand its technology hub in Bangalore. We are seeking high-impact Senior AWS Data Engineers who are ready to build scalable data platforms and implement cutting-edge solutions. This role is crucial for managing the infrastructure that supports data-driven decision-making in the pharmaceutical industry and ensuring that large-scale data assets are accessible and reliable.
The successful candidate will work extensively with AWS Glue, Lambda, and Databricks. You will be responsible for data modelling and processing using Python, PySpark, and SQL. This is a 100% work-from-office position in Bangalore, open to candidates who are currently serving their notice period or can otherwise join within 30 days. Your expertise will directly contribute to the innovation of data architectures in a fast-paced environment.
Key Requirements
6–12 years of professional experience in data engineering
Expertise in AWS Glue and AWS Lambda for serverless computing
Proficiency in Databricks for unified analytics and data processing
Strong programming skills in Python for data manipulation
Advanced knowledge of PySpark for big data processing tasks
Hands-on experience with SQL for complex database queries
Proven track record in Data Modelling and architectural design
Experience in the pharmaceutical or life sciences sector
Ability to build and maintain scalable data platforms
Strong analytical and problem-solving skills in a cloud environment
Salary: Negotiable or Not Mentioned
India, Bengaluru
20 days ago
se-mentor.com
1093 Views
Join SE Mentor Solutions as a Big Data Engineer in Bengaluru. We are seeking a senior professional with a minimum of 8 years of experience to lead our big data initiatives. You will be at the forefront of designing and managing large-scale, complex data systems that power our organization's analytics and operations. This role demands a high level of expertise in modern data technologies and a passion for building robust, scalable architectures.
You will utilize your skills in SQL, Python, Databricks, and PySpark to develop and optimize data pipelines within the Azure Cloud ecosystem. As a senior engineer, you will also be responsible for mentoring junior staff and ensuring that all data solutions align with industry best practices. Your role is critical in transforming raw data into actionable insights while maintaining system performance and reliability. Please note that salary details were not included in the original job advertisement.
Key Requirements
Minimum of 8 years of experience in data engineering or big data roles.
Expert-level proficiency in SQL and Python.
Deep technical knowledge of Databricks and PySpark.
Significant experience working with Azure Cloud services.
Proven track record of designing and implementing large-scale data systems.
Knowledge of Hadoop ecosystem components and big data frameworks.
Ability to optimize system performance and manage large data volumes.
Experience in mentoring and leading technical teams.
Strong understanding of API integrations and data security.
Excellent problem-solving and strategic thinking capabilities.
Salary: Negotiable or Not Mentioned
India, Bengaluru
55 days ago
alansatechnologies.com
548 Views
Alansa Technologies is currently seeking an experienced Oracle ADF Developer to join our dynamic team in Bengaluru. In this role, you will be responsible for the end-to-end design, development, and maintenance of high-quality enterprise applications using the Oracle Application Development Framework. You will leverage your expertise in ADF Faces, Task Flows, and Business Components to build scalable solutions that integrate seamlessly with various enterprise systems.