0 Negotiable or Not Mentioned
India
31 days ago
gspann.com
2014 Views
GSPANN is hiring an AI Developer with over 5 years of experience to contribute to our sophisticated artificial intelligence projects. The role is based in India, with offices located in Gurugram, Hyderabad, and Pune. You will be at the forefront of developing AI/ML solutions and managing complex data processes through efficient ETL and ELT pipelines. Candidates must be proficient in Azure AI Services, Python, and modern data integration processes. You will collaborate with cross-functional teams to build and deploy intelligent software systems, ensuring high performance and scalability of AI-driven applications. This role is ideal for a developer who enjoys solving difficult problems and wants to work on the latest technologies in the cloud AI space.
Key Requirements
Minimum of 5 years of experience in AI or software development.
In-depth knowledge of Azure AI Services and cloud environments.
Strong proficiency in Python development.
Extensive experience with ETL and ELT data processes.
Solid understanding of AI/ML algorithms and their application.
Experience in designing and maintaining scalable data pipelines.
Ability to work with both structured and unstructured data sources.
Familiarity with software development lifecycles (SDLC).
Strong problem-solving skills and attention to detail.
Excellent communication and teamwork capabilities.
0 Negotiable or Not Mentioned
India, Remote
18 days ago
idtsolution.in
1829 Views
We are looking for a skilled and passionate Data Engineer to join our advanced analytics team. The ideal candidate will have hands-on experience in IICS, Snowflake, SQL, and Cloud platforms, with a strong foundation in building scalable data pipelines and modern data integration frameworks. You will play a key role in developing data-driven solutions to support AI/ML initiatives and enhance customer experience across global operations. The role requires working from 12:00 PM to 9:00 PM IST to align with team requirements.
Your core responsibilities will include designing ETL/ELT processes and managing cloud-based data solutions on platforms like AWS, Azure, or GCP. You will handle both structured and semi-structured data while implementing essential data quality and monitoring processes. Collaboration with cross-functional teams is vital to support business growth initiatives. The package for this role is up to ₹14 LPA, reflecting the expertise required to handle complex data architectures and DevOps practices within a modern data stack environment.
Key Requirements
2+ years of experience in Data Engineering.
Strong hands-on experience with Snowflake.
Proficiency in Cloud platforms (AWS, Azure, or GCP).
Expert knowledge of SQL and Python.
Extensive experience with Informatica Cloud (IICS) and ETL tools.
Proven ability to build scalable data pipelines and cloud-based data solutions.
Knowledge of serverless architectures and APIs.
Familiarity with DevOps practices including CI/CD and IaC.
Experience with streaming tools like Kafka or Spark Streaming.
Bachelor’s degree in a relevant field such as Computer Science or Engineering.
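The listing above distinguishes ETL from ELT pipelines. A toy sketch of the ELT pattern it names, using stdlib sqlite3 as a stand-in for a warehouse like Snowflake (the table and column names here are invented for illustration, not from the posting):

```python
import sqlite3

# ELT: extract raw records, load them untouched, then transform with SQL
# inside the warehouse. sqlite3 stands in for Snowflake; raw_orders and
# daily_revenue are hypothetical names.
raw_orders = [
    ("o1", "2024-01-05", "149.99"),
    ("o2", "2024-01-05", "20.00"),
    ("o3", "2024-01-06", "75.50"),
]

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE raw_orders (id TEXT, order_date TEXT, amount TEXT)")
conn.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)", raw_orders)

# The transform step runs in-database (the "T" after "E" and "L"):
# cast the string amounts and aggregate revenue per day.
conn.execute("""
    CREATE TABLE daily_revenue AS
    SELECT order_date, ROUND(SUM(CAST(amount AS REAL)), 2) AS revenue
    FROM raw_orders
    GROUP BY order_date
""")

rows = conn.execute("SELECT * FROM daily_revenue ORDER BY order_date").fetchall()
print(rows)  # [('2024-01-05', 169.99), ('2024-01-06', 75.5)]
```

In classic ETL the cast-and-aggregate step would happen in an external tool (e.g. Informatica) before loading; in ELT it is pushed down to the warehouse's SQL engine, which is why the role pairs SQL depth with the platform skills.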
0 Negotiable or Not Mentioned
India
31 days ago
gspann.com
1482 Views
We are looking for an experienced GenAI Engineer to join the GSPANN team to work on cutting-edge generative technologies. This position is available in several key Indian locations including Gurugram, Hyderabad, and Pune. As a senior member of the team with over 6 years of experience, you will lead the development of innovative generative AI applications and frameworks that drive business value. The role requires deep expertise in Python and SQL.
0 Negotiable or Not Mentioned
India
56 days ago
ashudividend.com
551 Views
As a Databricks Developer at Ashu Dividend, you will be a key member of our data engineering team, responsible for designing, building, and maintaining robust and scalable data pipelines. This role focuses on leveraging Azure Databricks and PySpark to drive large-scale data projects, ensuring that our data infrastructure is efficient and reliable. You will work closely with cross-functional teams to translate business requirements into technical solutions.
0 Negotiable or Not Mentioned
India
16 days ago
nishtechnologies.com
939 Views
Nish Technologies is seeking a skilled Data Engineer to join a prestigious Big4 MNC client on a full-time basis. The ideal candidate will have between 5 and 8 years of professional experience in data engineering and will be responsible for designing and implementing efficient data solutions. The primary work locations for this role are Hyderabad and Bangalore. This position requires a strong technical background and the ability to work in a fast-paced, high-impact environment.
Candidates must be proficient in Python, SQL, and PySpark to handle complex data sets and pipelines. We are conducting a virtual weekend recruitment drive on Saturday, April 4th. This role is intended for immediate joiners or those with a notice period of up to 15 days. Interested professionals are encouraged to share their profiles for consideration in this expedited hiring process.
Key Requirements
5 to 8 years of relevant experience in Data Engineering.
Advanced proficiency in Python programming for data processing.
Expertise in writing complex SQL queries for database management.
In-depth knowledge of PySpark and its application in big data projects.
Ability to join immediately or within a maximum notice period of 15 days.
Experience working within a Big4 MNC or similar large-scale environment.
Strong analytical skills to solve complex data-related problems.
Familiarity with ETL processes and data pipeline orchestration.
Excellent communication skills for cross-functional team collaboration.
Availability to participate in the virtual weekend recruitment drive on April 4th.
0 Negotiable or Not Mentioned
India
20 days ago
pyxidiatech.com
1240 Views
We are seeking a highly skilled Senior Data Analyst to join our dynamic team and play a pivotal role in designing, building, and scaling innovative data solutions across our products and client implementations. The successful candidate will be responsible for developing scalable data pipelines, optimizing ETL workflows, and ensuring the highest standards of data quality and reliability. You will work closely with cross-functional teams, including Product, Engineering, and Data Science, to drive data architecture decisions and deliver actionable insights that solve complex business challenges. This role offers the opportunity to provide technical guidance and mentorship to junior analysts while working on impactful, data-driven projects.
This position is available in multiple locations across India, specifically Mumbai, Bangalore, and Pune. The ideal candidate will have over four years of experience in data-focused roles and a deep understanding of the AWS ecosystem, including Redshift, Athena, and EMR. Experience with US Healthcare Data is considered a significant advantage. Candidates must be proficient in Python, PySpark, and SQL, and possess a strong grasp of data modeling and performance optimization. If you are passionate about big data and looking to make a significant impact in a collaborative environment, we encourage you to apply.
Key Requirements
Minimum of 4 years of professional experience in Data Analytics or Data Engineering.
Demonstrated expertise in SQL and both relational and NoSQL database management.
Hands-on proficiency with Python and PySpark for processing large-scale datasets.
Proven experience in building and optimizing ETL/ELT pipelines using Airflow or AWS Glue.
Strong conceptual understanding of data modeling and performance tuning.
Advanced technical knowledge of AWS data services including S3, EMR, Redshift, and Athena.
Ability to design and manage complex data architectures across structured and unstructured sources.
Competency in maintaining high standards for data quality, validation, and monitoring.
Strategic thinking skills to perform analysis and generate actionable business insights.
Strong collaborative skills to work effectively with Product, Engineering, and Client teams.
0 Negotiable or Not Mentioned
India, Remote
15 days ago
sapphiresoftwaresolutions.com
1337 Views
We are seeking a skilled Data Engineer to join a fast-growing team supporting major global brands like KFC, Pizza Hut, and Taco Bell. This is a fully remote role based in India, with a shift schedule of 12 PM to 9 PM IST. The initial contract duration is three months, with a high probability of extension based on performance and project needs. You will be responsible for building and optimizing data pipelines using Informatica IICS and Snowflake, focusing on scalable data integration frameworks within an AWS cloud environment.
The ideal candidate should have at least 2 years of experience in data engineering, with strong technical skills in Python scripting, SQL, and event-driven architectures. You will work on impactful global projects, supporting advanced analytics and AI/ML initiatives while collaborating with dedicated DevOps teams. Exposure to Airflow and streaming pipelines such as Kafka or AWS Streaming is highly desirable. This is an excellent opportunity to work on modern data platforms and drive data-driven decisions for world-class organizations.
Key Requirements
2+ years of professional Data Engineering experience.
Proficiency in Informatica Cloud (IICS) as the primary ETL tool.
Hands-on experience with Snowflake as a target data platform.
Strong expertise in AWS (Amazon Web Services) cloud environment.
Advanced knowledge of SQL for complex data queries and manipulation.
Solid programming skills in Python for scripting and automation.
Experience handling both structured and semi-structured data formats.
Familiarity with REST APIs and AWS Lambda functions.
Ability to work the 12 PM to 9 PM IST shift.
Capacity to collaborate effectively with cross-functional teams and DevOps.
0 Negotiable or Not Mentioned
India, Hyderabad
56 days ago
spectraforce.com
550 Views
Leoforce is an AI-driven product company building intelligent talent acquisition and recruitment automation platforms for enterprises. We are seeking a Data Scientist to join our team in Hyderabad. In this role, you will work at the intersection of machine learning, generative AI, LLM-powered reasoning, and agentic AI automation, solving real enterprise hiring problems at scale. You will be responsible for building and deploying machine learning models.
0 Negotiable or Not Mentioned
India, Hyderabad
54 days ago
p99soft.com
541 Views
We are looking for a passionate AI Engineer with hands-on experience in Generative AI to join our growing team in Hyderabad. The successful candidate will design and develop end-to-end generative AI solutions on AWS, focusing on Large Language Models (LLMs), Retrieval-Augmented Generation (RAG) pipelines, and prompt engineering. You will work with cutting-edge agentic workflows and modern frameworks to build real-world AI applications. This is a hybrid position based in Hyderabad.
0 Negotiable or Not Mentioned
India, Pan India
28 days ago
ampcustech.com
1456 Views
Ampcustech is currently seeking a highly skilled GenAI Solution Architect to join our innovative team in India on a hybrid basis. The ideal candidate will be responsible for designing and implementing scalable, enterprise-grade AI-driven solutions utilizing Large Language Models (LLMs) and various cloud platforms. You will play a pivotal role in bridging the gap between cutting-edge AI research and production-ready applications, working closely with both engineering and product management teams to deliver high-impact results.
The role involves building diverse use cases including advanced document processing, automated summarization, intelligent Q&A systems, AI copilots, and workflow automation. You will define technical reference architectures for LLMs, vector stores, and orchestration layers, ensuring that all GenAI services are seamlessly integrated into existing data platforms. Furthermore, you will lead efforts in prompt engineering and optimization while maintaining a strong focus on security, privacy, and responsible AI ethics. This is an exciting opportunity to drive innovation in the healthcare domain and beyond.
Key Requirements
4–7 years of experience in software, data, or cloud architecture roles.
Hands-on experience with Generative AI and LLM-based solutions.
Proficiency with Azure OpenAI, OpenAI APIs, and LangChain.
Experience with Semantic Kernel or similar orchestration frameworks.
Strong understanding of REST APIs and cloud-based architectures.
Ability to collaborate effectively with engineering and product teams.
Strong communication skills to explain AI concepts to non-technical stakeholders.
Proficiency in prompt engineering, evaluation, and optimization techniques.
Experience defining reference architectures for vector stores and orchestration layers.
Commitment to ensuring security, privacy, and responsible AI practices.
0 Negotiable or Not Mentioned
India
9 hours ago
emperentech.com
46 Views
Emperen Technologies is seeking elite Databricks talent to join our global team of experts. As an Official Databricks Partner, we specialize in helping enterprises scale their data transformation initiatives faster, smarter, and more cost-efficiently. We are looking for professionals who can hit the ground running on a contract or hourly basis to meet urgent delivery needs for our diverse portfolio of enterprise clients.
Candidates will be responsible for leveraging Azure Databricks, Spark, and PySpark to build robust data pipelines and architectures. The role involves deep involvement in data migration, modernization, and the integration of AI/ML models into existing business analytics frameworks. You will work closely with Data Engineers and Architects to enable outcomes that drive business value. If you possess deep technical capability and a proven track record in data initiatives, we encourage you to apply.
Key Requirements
Proficiency in Azure Databricks and Apache Spark ecosystems.
Strong experience with PySpark for large-scale data processing.
Solid background in Data Engineering and Data Architecture principles.
Expertise in Data Migration and Modernization of legacy systems.
Ability to integrate AI/ML and Analytics into production data pipelines.
Available to work on a Contract and Hourly Basis for urgent delivery.
Strong communication skills for collaborating with CTOs and Heads of Data.
Experience with cloud infrastructure and security best practices.
Proven track record of delivering high-quality outcomes in fast-paced environments.
Knowledge of Spark optimization and performance tuning techniques.
0 Negotiable or Not Mentioned
India, Remote / Hybrid
11 days ago
infogine.com
655 Views
Join a forward-thinking team dedicated to developing cutting-edge LLM-based solutions, autonomous AI agents, and RAG pipelines that aim to redefine the future of enterprise AI. As a Gen AI / LLM Specialist, you will be at the forefront of technical innovation, applying advanced AI techniques to solve complex business problems. This role requires a balance of research-oriented thinking and practical application to build robust, scalable AI systems. Possible work locations include remote or hybrid arrangements within India.
The ideal candidate will demonstrate technical leadership in fine-tuning large language models and optimizing prompt engineering workflows. You will collaborate with cross-functional teams to integrate Vector Databases and API frameworks into production-ready environments. A significant portion of the role involves ensuring Responsible AI practices and Agent Safety, maintaining high standards for the ethical deployment of autonomous systems. This position is designed for experienced professionals with a solid foundation in Python, NLP, and Deep Learning who are ready to take on a general shift role in a high-impact environment.
Key Requirements
6–10 years of professional experience in software development or AI research.
Demonstrated hands-on expertise with LLM Fine-Tuning and Prompt Engineering.
Advanced proficiency in Python programming for data science and AI applications.
Strong background in Natural Language Processing (NLP) and Deep Learning methodologies.
Proven experience building and optimizing Retrieval-Augmented Generation (RAG) pipelines.
Practical knowledge of Vector Databases such as Pinecone, Milvus, or Weaviate.
Experience designing and implementing robust API Frameworks for AI model integration.
Deep understanding of Responsible AI principles and safety protocols for autonomous agents.
Ability to work effectively in a general shift schedule within a hybrid or remote setup.
Strong problem-solving skills and the ability to translate business requirements into technical AI solutions.
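The RAG pipelines named in the listing above hinge on a retrieval step: rank stored documents by vector similarity to the query, then feed the best match to the LLM as context. A stdlib-only sketch of that step (the embeddings and document names below are made up; a real system would obtain embeddings from a model and store them in a vector database such as Pinecone, Milvus, or Weaviate):

```python
import math

# Cosine similarity between two embedding vectors.
def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy 3-dimensional "embeddings"; real ones have hundreds of dimensions.
docs = {
    "refund policy": [0.9, 0.1, 0.0],
    "shipping times": [0.1, 0.8, 0.3],
    "api rate limits": [0.0, 0.2, 0.9],
}
query = [0.85, 0.15, 0.05]  # pretend embedding of "how do refunds work?"

# Retrieve the top document; its text would be prepended to the LLM
# prompt as grounding context (the "augmented generation" part of RAG).
best = max(docs, key=lambda name: cosine(query, docs[name]))
print(best)  # refund policy
```

Vector databases replace this linear scan with approximate nearest-neighbour indexes so retrieval stays fast over millions of documents, which is why the role lists them alongside RAG experience.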
0 Negotiable or Not Mentioned
India, Hyderabad
17 days ago
xautomations.com
678 Views
xautomations is seeking a Senior Python Developer to join our team in Hyderabad. In this role, you will be responsible for building and maintaining real-world systems at scale, focusing on real-time processing and high-performance data pipelines. You will work closely with other engineers to design robust architectures that handle large volumes of data efficiently while ensuring system stability and speed.
As a key member of our engineering team, you will contribute to the development of our core platforms and be involved in the full software development lifecycle. We value developers who are passionate about clean code, scalable infrastructure, and innovative problem-solving. This is a full-time, work-from-office position offering the opportunity to tackle complex technical challenges in a dynamic and fast-paced environment.
Key Requirements
Proficiency in Python programming and related frameworks.
Extensive experience with data pipelines and real-time systems.
Proven ability to build and maintain systems at scale.
Strong debugging and troubleshooting skills for complex applications.
Experience with both SQL and NoSQL database management.
Familiarity with high-performance platforms and low-latency code.
In-depth knowledge of RESTful API design and implementation.
Experience with cloud platforms such as AWS, Azure, or GCP.
Strong understanding of the complete software development lifecycle (SDLC).
Ability to work effectively in a collaborative office environment.
0 Negotiable or Not Mentioned
India, Remote
56 days ago
myskysys.com
553 Views
The Senior Data Operations Engineer role is a 100% remote full-time contract position designed for candidates located in India. The successful candidate will collaborate closely with business partners to establish system connectivity and oversee the entire lifecycle of data pipelines, from initial collection to the final deployment of data models. This role requires a professional who can monitor pipeline performance, resolve bottlenecks, and implement improvements.
0 Negotiable or Not Mentioned
India
8 days ago
gmail.com
657 Views
SearchMate is partnering with an elite client to recruit a Senior SDET specializing in ETL environments and Backend Automation. This is a high-impact role aimed at selection-ready professionals capable of managing complex data pipelines and developing robust automation frameworks. The candidate will work under a hybrid model with office locations available in Chennai, Hyderabad, and Pune. The selected candidate will join the client's payroll for an initial duration of 6 months, with the possibility of extension based on performance and project needs. Responsibilities include ensuring the architectural integrity of massive data migrations and performing automated validation for a global enterprise. This role offers a strategic path into high-tier US product environments.
Key Requirements
7+ Years of overall IT experience in Quality Engineering.
Proven hands-on experience working as a QA Tester in ETL environments.
Strong proficiency in SQL.
Deep experience in RDBMS databases such as Oracle or SQL Server.
Hands-on expertise in Java-Selenium or Python automation testing.
Fluent in Agile workflows and cross-functional team collaboration.
Exceptional English communication skills for stakeholder interfacing.
Ability to bridge gaps between complex data pipelines and automation frameworks.
Experience in architectural integrity for massive data migrations.
Proactive mindset for high-tier US product environment standards.
0 Negotiable or Not Mentioned
India, Remote
6 days ago
tekvo.io
557 Views
Tekvo is looking for a seasoned and highly motivated Azure Technical Lead to spearhead our cloud data engineering projects. In this critical role, you will be the driving force behind large-scale Azure analytics initiatives, overseeing the end-to-end development of high-impact data platforms. You will be responsible for defining the solution design, ensuring technical excellence across the delivery lifecycle, and providing strategic guidance to engineering teams. Your expertise will directly contribute to the creation of scalable, robust, and efficient data architectures that empower our clients to make data-driven decisions.
As an Azure Technical Lead, you must demonstrate mastery over core Azure data services such as Data Factory, Synapse, and Databricks. The role demands a blend of deep technical proficiency in SQL and Python along with the leadership skills required to mentor developers and manage complex stakeholder expectations. This remote position offers a unique opportunity for professionals based in India to work on cutting-edge cloud technologies within a collaborative environment. Successful candidates will be expected to maintain high standards of code quality and architectural integrity while driving innovation in the data engineering space.
Key Requirements
Possess 10–14 years of professional experience in data engineering and cloud platforms.
Demonstrate expert-level proficiency in designing and implementing Azure Data Factory pipelines.
Have hands-on experience with Azure Synapse Analytics for enterprise data warehousing.
Show strong technical expertise in using Azure Databricks for big data processing.
Maintain advanced knowledge of SQL for complex data manipulation and performance tuning.
Exhibit proficiency in Python programming for automating data workflows and engineering tasks.
Prove a track record of leading and delivering large-scale analytics initiatives on Azure.
Possess strong solution design skills with the ability to create scalable data architectures.
Demonstrate the ability to guide, mentor, and manage high-performing technical teams.
Experience in cloud security best practices and data governance frameworks is highly preferred.
Excellent communication skills to interact with stakeholders and translate business needs into technical solutions.
0 Negotiable or Not Mentioned
India
52 days ago
clogicsofttech.com
534 Views
As an AI/ML Engineer at Clogic Softtech, you will be at the forefront of technological innovation, working for a top-tier MNC. You will design, implement, and optimize machine learning models and artificial intelligence systems to solve complex business problems. This role involves data processing, model evaluation, and staying updated with the latest advancements in the AI field to ensure our solutions remain cutting-edge. We are looking for professionals passionate about machine learning and AI.
0 Negotiable or Not Mentioned
India, Remote
54 days ago
vgroinfotech.in
541 Views
Vgro Infotech is seeking serious and dedicated interns for an Application Development role. This is a hands-on internship designed for those who want real-world exposure to writing production code and building scalable web applications. Unlike traditional internships, this program focuses on live client projects and internal platforms where your work will have a direct impact and go live. The work environment is fast-paced and outcome-driven, offering hands-on experience from day one.
0 Negotiable or Not Mentioned
India
55 days ago
adbornsolutions.com
546 Views
Adborn Solutions is currently seeking a highly skilled and experienced Senior Developer to join their esteemed client's team. This role is designed for a self-driven technologist with a passion for writing great code and a strong command over modern web technologies. The ideal candidate will be hands-on with full-stack development, capable of writing scalable, maintainable code, and contributing to architectural decisions that drive business impact.
0 Negotiable or Not Mentioned
India
31 days ago
gspann.com
1695 Views
GSPANN is currently seeking five dedicated AI Ops Engineers to join our team across several locations in India, including Gurugram, Hyderabad, Pune, and Delhi NCR. This role focuses on optimizing AI operations and ensuring the seamless integration of machine learning models into production environments through robust automation and monitoring strategies. As part of a forward-thinking team, you will play a crucial role in maintaining the reliability and performance of AI systems in production.
0 Negotiable or Not Mentioned
India
31 days ago
mobilutionit.com
1638 Views
Mobilution IT Systems is currently seeking a skilled and experienced Backend Developer to join our dynamic team on a contract basis. This role is ideal for professionals with 4 to 10 years of experience who are proficient in Python and modern web frameworks. As a Backend Developer, you will be responsible for designing, developing, and maintaining robust server-side logic and ensuring high performance and responsiveness to requests from the front end.
0 Negotiable or Not Mentioned
India
55 days ago
mindtechdigital.com
546 Views
OSSS Consulting Services India Pvt. Ltd is currently seeking an experienced Senior Developer to join their esteemed client's team. This role is designed for a technical expert with approximately 15 years of experience who excels at building scalable, high-performance backend systems. You will be responsible for designing robust and maintainable architectures, collaborating across functional teams, and ensuring the overall security and reliability of backend systems.
0 Negotiable or Not Mentioned
India, Hyderabad
28 days ago
esquareinfo.com
1310 Views
We are seeking a seasoned Python Developer to join our team in Hyderabad to work on complex, business-critical systems. This role requires a high level of technical expertise and the ability to collaborate with various cross-functional teams to ensure the successful delivery of software solutions. The developer will be responsible for managing multiple projects across different domains, ensuring that all systems are robust and scalable to meet enterprise demands.
The ideal candidate should have a strong background in Python development and significant experience with cloud platforms such as AWS and Azure. Exposure to AI and data-driven applications is highly desirable, as it will play a key role in the future direction of our technical landscape. This position offers a hybrid work model, combining the benefits of remote work with the collaborative environment of our Hyderabad office. Successful applicants will demonstrate a proactive approach to problem-solving and a commitment to maintaining high standards of software quality in an enterprise environment.
Key Requirements
6 to 8 years of professional experience in Python development.
Proven expertise in working with AWS cloud services.
Significant experience with Microsoft Azure platforms.
Demonstrated history of developing AI or Data-driven applications.
Experience working on business-critical systems and enterprise software.
Strong ability to collaborate with cross-functional teams and stakeholders.
Competency in managing and delivering multiple projects across various domains.
Excellent problem-solving and analytical skills in a technical environment.
Solid understanding of software development life cycle (SDLC) best practices.
Proficiency in database management and system integration.
0 Negotiable or Not Mentioned
India
53 days ago
talent-wing.com
535 Views
Talent Wing Consultants is urgently hiring a Backend Developer Lead to oversee complex technical projects and lead a team of talented software engineers. This role requires a seasoned professional with a deep understanding of backend systems, architectural design, and modern software development practices. The successful candidate will be instrumental in building scalable, robust systems that support high-performance applications and drive technological innovation.
0 Negotiable or Not Mentioned
India, Remote
4 days ago
adroitinnovative.com
260 Views
Adroit Innovative is seeking a highly skilled IAM Lead to spearhead our identity and access management initiatives. This senior-level position requires a professional with at least 8 years of experience in software engineering, specifically emphasizing a leadership role within IAM, authorization, or security infrastructure for the past 3 years. The ideal candidate will be adept at working in a remote environment while supporting our operations across key Indian hubs, including Hyderabad, Bengaluru, Pune, and Chennai. You will be responsible for designing and implementing complex security architectures, focusing on PBAC and ReBAC frameworks using advanced tools such as Styra DAS, OpenFGA, or Zanzibar.
As an IAM Lead, you will utilize your deep technical expertise in OAuth 2.0, OpenID Connect, and token-based authentication to secure enterprise systems. The role involves extensive collaboration with cross-functional teams to explore and analyze complex database schemas and integrate IAM/IGA platforms like ForgeRock, SailPoint, or Saviynt. We are looking for an individual who can produce comprehensive architecture diagrams and technical reports while ensuring compliance with global standards such as GDPR, SOX, and FAPI 2.0. This is an immediate-joiner role that offers the opportunity to work with cutting-edge cloud environments like AWS, GCP, and Azure to build scalable and secure access control solutions.
Key Requirements
Minimum of 8 years in software engineering with a focus on security.
At least 3 years in a lead role focused on IAM, authorization, or security infrastructure.
Hands-on experience with PBAC or ReBAC implementations using Styra DAS or OpenFGA.
Strong working knowledge of OAuth 2.0, OpenID Connect, JWT, and SAML.
Proficiency in relational databases such as SQL Server, PostgreSQL, or MySQL.
Familiarity with IAM/IGA platforms like RadiantLogic, ForgeRock, or SailPoint.
Advanced proficiency in at least one backend language like Java, C#/.NET, or Python.
Extensive experience working within cloud environments such as AWS, GCP, or Azure.
Ability to produce detailed architecture diagrams and technical comparison matrices.
Knowledge of compliance standards including SOX, GLBA, GDPR, and FAPI 2.0.
Ability to start immediately (Immediate Notice Period).
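The token-based authentication named in the IAM listing above centres on JWTs: a header, a claims payload, and a signature, each base64url-encoded and joined with dots. A stdlib-only sketch of inspecting the claims (the token here is hand-built with alg "none" purely for illustration; production code must verify the signature with a proper library before trusting any claim):

```python
import base64
import json

# base64url-encode a JSON object with padding stripped, per the JWT format.
def b64url_encode(obj):
    raw = json.dumps(obj, separators=(",", ":")).encode()
    return base64.urlsafe_b64encode(raw).rstrip(b"=").decode()

# Decode the middle (payload) segment of a JWT back into its claims.
def decode_payload(token):
    payload = token.split(".")[1]
    payload += "=" * (-len(payload) % 4)  # restore stripped padding
    return json.loads(base64.urlsafe_b64decode(payload))

# Hand-built, unsigned token for illustration only; claim values are invented.
header = b64url_encode({"alg": "none", "typ": "JWT"})
payload = b64url_encode({"sub": "user-42", "scope": "read:reports", "exp": 1900000000})
token = f"{header}.{payload}."

claims = decode_payload(token)
print(claims["sub"], claims["scope"])  # user-42 read:reports
```

In an OAuth 2.0 / OpenID Connect flow, the `scope` claim is what an authorization layer (PBAC or ReBAC policy engine) evaluates to decide access; the decode step above only reads claims and is never a substitute for signature verification.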
0 Negotiable or Not Mentioned
India, Remote
5 days ago
adysunventures.com
311 Views
Adysun Ventures Pvt. Ltd. is currently seeking a skilled FastAPI Developer to design and implement high-performance, scalable REST APIs. In this role, you will be responsible for crafting schemas and validation layers using Pydantic, building asynchronous services with Python, and ensuring seamless integration with PostgreSQL and SQLAlchemy. The position involves working on robust backend systems that power our core services, requiring a focus on clean engineering practices and technical excellence.
This is a full-time opportunity based in India, with flexible work locations including Pune and Mumbai, or the option for fully remote work. Candidates should possess between 2 to 4 years of professional experience in backend or API development. Beyond core development, you will be involved in writing comprehensive tests with pytest, managing containerized deployments with Docker, and utilizing Redis for background task processing. This role is ideal for developers who are passionate about modern Python ecosystems and building reliable infrastructure.
Key Requirements
Proficiency in Python development with a focus on FastAPI and Pydantic.
Extensive experience with asynchronous programming using async/await syntax.
Strong knowledge of database management using SQLAlchemy and PostgreSQL.
Ability to design and maintain RESTful API endpoints and data validation schemas.
Experience writing unit and integration tests using the pytest framework.
Hands-on experience with Docker for containerization and consistent environment deployment.
Familiarity with Redis and managing background tasks for distributed systems.
2 to 4 years of relevant experience in backend or API-centric roles.
Understanding of version control systems, specifically Git, for collaborative development.
Knowledge of CI/CD pipelines to streamline the software delivery process.
Excellent problem-solving skills and a commitment to high-quality software engineering standards.
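The async/await requirement above can be illustrated without FastAPI itself. The sketch below uses only the standard library; `fetch_user` is a hypothetical stand-in for an async database call (e.g. through a SQLAlchemy `AsyncSession`), and `asyncio.gather` runs the calls concurrently rather than one after another:

```python
import asyncio

async def fetch_user(user_id: int) -> dict:
    # stand-in for an awaitable I/O call such as a PostgreSQL query
    await asyncio.sleep(0.01)
    return {"id": user_id, "name": f"user-{user_id}"}

async def main() -> list:
    # gather schedules all coroutines concurrently; results keep input order
    return await asyncio.gather(*(fetch_user(i) for i in range(3)))

users = asyncio.run(main())
print([u["id"] for u in users])  # [0, 1, 2]
```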
0 Negotiable or Not Mentioned
India
11 days ago
axtria.com
650 Views
Axtria is expanding its Market Mix (MMx) Analytics team and is looking for high-impact analytics professionals passionate about the pharmaceutical and life sciences sector. As an integral part of the team, you will be responsible for building and leading Market Mix and Promotion Response models that drive strategic decisions for global clients. This role involves deep collaboration with international stakeholders and delivering high-quality, actionable insights through advanced analytics and data science methodologies. Candidates will work on cutting-edge AI/ML-driven analytics platforms and have the opportunity for continuous learning through the Axtria Institute.
The role also involves mentoring junior analysts and contributing to the overall growth of Axtria’s analytics capabilities in a transparent and collaborative culture. Possible work locations for this role include Gurgaon, Bangalore, Noida, Pune, and Hyderabad. At Axtria, we prioritize high-impact analytical work that directly influences pharmaceutical commercial excellence and marketing efficiency. You will be expected to utilize your expertise in Python or R and statistical modeling to provide best-in-class results for our partners. This is a unique opportunity to grow professionally within a supportive and innovative environment focused on health and life sciences.
Key Requirements
2–6 years of experience in Marketing Analytics.
At least 2 years of hands-on experience with MMx, PRM, or Test & Control models.
Proficiency in Python or R programming for data analysis.
Strong foundations in statistics and statistical modeling.
Solid understanding of Machine Learning (ML) techniques.
Experience in Pharma or Life Sciences industry is preferred.
Proven ability to build and lead Market Mix and Promotion Response models.
Experience in partnering with global and client stakeholders for delivery.
Ability to deliver client-ready analytics and actionable insights.
Experience in mentoring junior analysts and contributing to analytics platforms.
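Market Mix modelling as described above typically regresses sales on transformed media spend, and one standard transform is geometric adstock, which carries part of each period's spend effect into later periods. A minimal sketch (the `decay` value here is purely illustrative):

```python
def adstock(spend, decay=0.5):
    # geometric adstock: effect_t = spend_t + decay * effect_{t-1}
    carried, out = 0.0, []
    for x in spend:
        carried = x + decay * carried
        out.append(carried)
    return out

# a burst of spend keeps contributing after it stops
print(adstock([100, 0, 0, 50], decay=0.5))  # [100.0, 50.0, 25.0, 62.5]
```

In a full model the adstocked series would feed a regression of sales on each channel, often with an additional saturation curve per channel.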
0 Negotiable or Not Mentioned
India, Remote
27 days ago
e-solutionsinc.com
1780 Views
E-Solutions Inc is hiring a Senior LLM S2 Annotator (CUA Trajectory Specialist) for a temporary five-week engagement. This remote position involves working with advanced AI systems and agentic workflows to decompose complex technical instructions into clear, structured steps. The role requires a candidate with a strong technical background in software development or technical support, capable of maintaining high-quality documentation in a fast-paced environment.
The specialist will operate within Linux environments and utilize scripting languages such as Python or Bash to manage technical tasks. A key responsibility is managing trajectories using tools like OpenClaw while ensuring detailed documentation of all technical processes. Candidates must be prepared to work an eight-hour daily shift that includes a four-hour overlap with the PST time zone to facilitate collaboration with the primary engineering team.
Key Requirements
2–5 years of experience in software development, technical support, or similar technical roles.
Strong familiarity with Linux environments and command-line operations.
Proficiency in at least one scripting language: Python or Bash.
Ability to decompose complex instructions into structured, step-by-step workflows.
Strong attention to detail in documenting technical processes.
Exposure to LLM-based tools, AI systems, or agentic workflows.
Basic understanding of APIs, file systems, and developer tooling.
Familiarity with OpenClaw or similar environments and tools.
Availability to work 8 hours per day with a 4-hour overlap with PST time zone.
Senior-level proficiency in technical troubleshooting and problem-solving.
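The posting does not specify OpenClaw's actual trajectory format, so the record layout below is purely hypothetical; it only illustrates the core annotation task named above: decomposing an agent run into structured, documented steps that serialize cleanly.

```python
import json
from dataclasses import dataclass, asdict

@dataclass
class Step:
    index: int
    action: str        # e.g. the shell command the agent ran
    observation: str   # what the environment returned
    rationale: str     # annotator's note on why this step occurs

trajectory = [
    Step(1, "ls /etc", "hosts  passwd", "enumerate config files"),
    Step(2, "cat /etc/hosts", "127.0.0.1 localhost", "inspect host mappings"),
]

# serialize for review or downstream training pipelines
print(json.dumps([asdict(s) for s in trajectory], indent=2))
```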
0 Negotiable or Not Mentioned
India, Remote
23 days ago
e-solutionsinc.com
1636 Views
We are seeking a highly skilled LLM S2 Annotator (CUA Trajectory Specialist) to join our team for a 5-week project. This role focuses on utilizing technical expertise to evaluate and annotate LLM trajectories within agentic workflows. The successful candidate will work extensively with tools like OpenClaw and must be comfortable navigating Linux environments using command-line operations. The position requires a daily commitment of 8 hours, ensuring a 4-hour overlap with the PST time zone to facilitate seamless collaboration with our global development team.
Candidates should possess a strong background in software development or technical support, with specific proficiency in Python or Bash scripting. Your primary responsibility will be decomposing complex technical instructions into structured, step-by-step workflows and documenting technical processes with extreme precision. This is a remote opportunity specifically open to candidates in this region, offering a chance to contribute to cutting-edge AI system development and the evolution of LLM-based agentic tools.
Key Requirements
2–5 years of experience in software development, technical support, or similar technical roles.
Strong familiarity with Linux environments and command-line operations.
Proficiency in at least one scripting language: Python or Bash.
Ability to decompose complex instructions into structured, step-by-step workflows.
Strong attention to detail in documenting technical processes.
Exposure to LLM-based tools, AI systems, or agentic workflows.
Basic understanding of APIs, file systems, and developer tooling.
Familiarity with OpenClaw or similar environments/tools.
Ability to work 8 hours per day with a 4-hour overlap with the PST time zone.
Senior-level experience in technical environments.
0 Negotiable or Not Mentioned
India
2 days ago
hiringeye.com
233 Views
We are looking for a skilled AI/ML Technical Lead to drive the design, development, and deployment of AI solutions. This role involves leading a team, working on the end-to-end machine learning lifecycle, and delivering scalable, high-impact solutions to solve complex business problems. The candidate will be responsible for mentoring a dedicated team of engineers and ensuring the implementation of best practices in model development and monitoring. Possible work locations for this role include Delhi NCR, Bangalore, Pune, and Hyderabad.
The ideal candidate should possess 8 to 12 years of relevant experience and be prepared for immediate joining or currently serving a notice period. The role requires working across various domains such as Natural Language Processing, Computer Vision, and Generative AI while optimizing performance and scalability. Expertise in MLOps and cloud platforms like AWS, Azure, or GCP is essential for success in this position. The designated shift timing for this role is from 2:00 PM to 11:00 PM IST.
Key Requirements
Lead and mentor a team of AI/ML engineers to achieve project milestones
Design, build, and deploy sophisticated machine learning models for production
Work across NLP, Computer Vision, and Generative AI use cases effectively
Optimize models for maximum performance, accuracy, and enterprise scalability
Collaborate with cross-functional teams including Data, Engineering, and DevOps
Ensure best practices in MLOps, including deployment and continuous monitoring
Strong experience in Python programming and frameworks like TensorFlow or PyTorch
Solid understanding of ML, Deep Learning, and specialized AI subfields
Hands-on experience with cloud platforms such as AWS, Azure, or GCP
Proven experience in handling large datasets and complex feature engineering
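The model-building requirements above center on the training loop that frameworks like TensorFlow or PyTorch automate. As a framework-free sketch of that loop, gradient descent on a one-parameter linear model (data and learning rate chosen for illustration) looks like:

```python
# fit y = w * x by minimizing mean squared error
xs = [1.0, 2.0, 3.0, 4.0]
ys = [2.0, 4.0, 6.0, 8.0]  # generated with true w = 2

w, lr = 0.0, 0.05
for _ in range(200):
    # gradient of MSE with respect to w
    grad = sum(2 * (w * x - y) * x for x, y in zip(xs, ys)) / len(xs)
    w -= lr * grad  # step against the gradient

print(round(w, 3))  # 2.0
```

Real frameworks compute the gradient automatically and update millions of parameters, but the optimize-step-repeat structure is the same.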