Negotiable or Not Mentioned
Remote
14 days ago
asofttek.com
843 Views
We are seeking a skilled Data Modeler for an urgent role supporting a major healthcare client. This remote position focuses on the end-to-end design of flattened schemas specifically optimized for Google Cloud Platform environments, including BigQuery and downstream serving views. You will be instrumental in creating mapping patterns that transform nested FHIR structures into high-performance relational or semi-relational models while ensuring the highest level of data integrity.
The successful candidate will establish comprehensive data modeling standards and reusable patterns to ensure scalability across various clinical domains. You will work in close partnership with engineering teams to ensure that all data models effectively support high-concurrency analytics and modern reporting tools. This is an excellent opportunity for a professional with a strong background in healthcare data architecture and NoSQL transformation to contribute to a scalable, cloud-native data environment.
Key Requirements
Expertise in data modeling specifically for Google Cloud Platform (GCP) tools like BigQuery and Dataflow.
Proven experience transforming NoSQL and nested structures such as JSON and FHIR into flattened schemas.
Strong understanding of healthcare data architectures and advanced normalization strategies.
Ability to own the end-to-end design of flattened schemas optimized for high-performance analytics.
Experience developing mapping patterns for relational and semi-relational data models.
Skill in establishing data modeling standards and reusable patterns for enterprise scalability.
Collaborative mindset to partner with engineering teams for high-concurrency reporting support.
A Bachelor's or Master's degree in Information Technology, Computer Science, or a relevant technical field.
Proficiency in SQL and managing large-scale datasets in a cloud-native environment.
Strong analytical and problem-solving skills applied to complex clinical data domains.
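The FHIR-to-flat mapping pattern this role describes can be sketched in a few lines. The nested input below follows the standard FHIR R4 Patient shape (repeated `name` and `address` elements), but the flattening function and its output column names are illustrative, not an actual client mapping:

```python
# Minimal sketch: flatten a nested FHIR-style Patient resource into a
# single flat row suitable for a relational/BigQuery-style schema.
# Output column names are illustrative.

def flatten_patient(resource: dict) -> dict:
    """Map a nested FHIR Patient resource to a flat dict of columns."""
    # FHIR name/address are repeated fields; take the first entry.
    name = (resource.get("name") or [{}])[0]
    address = (resource.get("address") or [{}])[0]
    return {
        "patient_id": resource.get("id"),
        "family_name": name.get("family"),
        "given_name": " ".join(name.get("given", [])),
        "birth_date": resource.get("birthDate"),
        "city": address.get("city"),
        "state": address.get("state"),
    }

patient = {
    "resourceType": "Patient",
    "id": "p-123",
    "name": [{"family": "Doe", "given": ["Jane", "Q"]}],
    "birthDate": "1980-05-01",
    "address": [{"city": "Austin", "state": "TX"}],
}
row = flatten_patient(patient)
```

In practice the same pattern is often expressed directly in BigQuery SQL over repeated fields rather than in application code; the trade-off is where schema drift gets handled.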
Negotiable or Not Mentioned
Remote
14 days ago
gvrinfotek.com
839 Views
We are seeking a highly experienced Senior Developer / Architect for a long-term remote position. This role requires the candidate to operate within EST hours and focus on building and maintaining high-volume transaction systems. The successful candidate will lead architectural decisions and contribute to a robust technical environment utilizing modern cloud technologies and data streaming platforms. The project is expected to last 12 or more months.
The ideal candidate will possess over nine years of professional experience in Java development and a strong background in Google Cloud Platform services such as BigQuery. Key responsibilities include designing and optimizing PostgreSQL databases and implementing Kafka for efficient data streaming. Additionally, exposure to AI and automation technologies is highly valued as we look to integrate smarter processes into our software lifecycle. The recruitment process includes an online assessment followed by two rounds of virtual interviews.
Key Requirements
At least 9 years of professional experience in Java development.
Extensive expertise in Google Cloud Platform (GCP) services.
Proven experience with BigQuery for large-scale data analysis.
Proficiency in Kafka for real-time data streaming and processing.
Advanced skills in PostgreSQL database design and performance optimization.
Strong background in architecting and maintaining high-volume transaction systems.
Familiarity or exposure to AI and automation technologies in a production environment.
Ability to work and collaborate effectively within EST timezone hours.
Experience in leading software architectural decisions and implementation.
Capability to complete complex online technical assessments and virtual interviews.
Negotiable or Not Mentioned
Remote
14 days ago
tanuinfotech.com
979 Views
Join our team as a Google Dialogflow CX Expert / Conversational AI Engineer to design and deliver scalable, enterprise-grade virtual assistant experiences powered by advanced conversational design and cloud integrations. In this role, you will be responsible for designing and building conversational flows using Dialogflow CX, configuring intents, pages, routes, parameters, and events, and managing conversation state, routing, and session continuity. You will collaborate with backend, platform, and product teams to integrate conversational flows with backend APIs and services, improving chatbot performance, fallback handling, and user experience.
The ideal candidate will possess over 8 years of experience and a strong background in cloud-based application environments. You will optimize conversation quality and escalation paths, establish best practices for scalable conversational design, and document architecture, workflows, and platform standards. This full-time position is remote and requires strong debugging and troubleshooting skills along with excellent communication capabilities. Experience with Google Cloud services such as Cloud Run, Firestore, and IAM, as well as exposure to LLMs and Agentic AI, will be highly valued as we continue to drive digital transformation through automation and AI engineering.
Key Requirements
Strong hands-on experience with Google Dialogflow CX
Expertise in conversational design and stateful flow management
Experience integrating with webhooks and REST APIs
Strong understanding of JSON request and response handling
Experience in cloud-based application environments
Strong debugging and troubleshooting skills
Excellent communication and collaboration skills
Minimum of 8 years of professional experience in conversational AI or related field
Ability to manage conversation state, routing, and session continuity
Familiarity with Google Cloud services like Cloud Run, Firestore, and IAM
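The webhook and JSON handling the requirements above call for can be sketched as a pure function over the Dialogflow CX request/response payloads. The field names below follow the CX `WebhookRequest`/`WebhookResponse` JSON format, but verify them against the current API reference before relying on them; the tag and parameter values are hypothetical:

```python
# Sketch of a Dialogflow CX webhook handler as a pure function:
# read the fulfillment tag and session parameters from the request,
# return a fulfillment message and updated session parameters.

def handle_webhook(request: dict) -> dict:
    tag = request.get("fulfillmentInfo", {}).get("tag", "")
    params = request.get("sessionInfo", {}).get("parameters", {})
    reply = f"Handled tag '{tag}' for order {params.get('order_id', 'unknown')}"
    return {
        "fulfillmentResponse": {
            "messages": [{"text": {"text": [reply]}}]
        },
        # Writing parameters back here is how CX webhooks update session state.
        "sessionInfo": {"parameters": {"handled": True}},
    }

req = {
    "fulfillmentInfo": {"tag": "order-status"},
    "sessionInfo": {"parameters": {"order_id": "A42"}},
}
resp = handle_webhook(req)
```

In a deployed assistant this function would sit behind an HTTPS endpoint (e.g. Cloud Run) registered as the webhook for the relevant pages and routes.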
Negotiable or Not Mentioned
Remote
12 days ago
zelarsoft.com
1484 Views
Zelarsoft is looking for a passionate Junior DevOps Engineer to work on real-time cloud infrastructure projects. In this role, you will be responsible for handling GCP-based deployments, CI/CD automation, and ensuring end-to-end task ownership for various infrastructure projects. This position provides an excellent opportunity to collaborate with senior engineers and gain valuable hands-on exposure to Site Reliability Engineering (SRE) and enterprise systems within a professional IT environment.
The ideal candidate will have 2 to 3 years of experience in managing cloud environments and be familiar with modern DevOps tooling and practices. You will work within a team focused on delivering high-quality, scalable cloud solutions and will participate in the continuous improvement of infrastructure delivery. This role is perfect for a cloud enthusiast looking to grow their technical skill set while contributing to meaningful projects. Interested candidates should submit their updated resume to the provided email address for consideration.
Key Requirements
2 to 3 years of professional experience in DevOps or cloud infrastructure roles.
Hands-on experience with Google Cloud Platform (GCP) services and deployments.
Proficiency in managing CI/CD pipelines and automation workflows.
Strong knowledge of container orchestration tools like Kubernetes.
Experience with Infrastructure as Code (IaC) using tools such as Terraform.
Solid understanding of Site Reliability Engineering (SRE) principles.
Ability to take end-to-end ownership of technical tasks and deployments.
Experience with version control systems such as Git for code management.
Familiarity with scripting languages like Bash, Python, or Go for automation.
Excellent communication and collaboration skills for working with senior engineering teams.
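One concrete flavor of the scripting-for-automation requirement above is wrapping deployment health checks in retries with exponential backoff. A minimal sketch, with the retry counts and delay caps chosen purely for illustration:

```python
# Sketch: compute an exponential-backoff schedule for retrying a
# post-deployment health check. Parameters are illustrative defaults.

def backoff_delays(retries: int = 5, base: float = 1.0, cap: float = 30.0) -> list[float]:
    """Return the wait (in seconds) before each retry, doubling up to a cap."""
    return [min(cap, base * 2 ** i) for i in range(retries)]

delays = backoff_delays()
```

A real script would `time.sleep()` each delay between probes of the service endpoint, often adding jitter so many workers do not retry in lockstep.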
Negotiable or Not Mentioned
Remote
27 days ago
intellisofttech.com
1687 Views
We are seeking a highly skilled Data Architect specializing in Google Spanner for a long-term contract engagement. In this remote role, you will be responsible for owning the target-state data architecture, ensuring that the new data models meet the highest standards of functional correctness and performance. You will play a critical role in defining the architectural roadmap before large-scale application rewrites begin, catering to the needs of over 1,000 diverse applications.
The ideal candidate will have extensive experience with transactional semantics and performance SLOs within the Google Cloud ecosystem. You will collaborate with cross-functional teams to ensure that the database infrastructure supports high-volume, mission-critical operations. This is an excellent opportunity to influence the data strategy of a major enterprise environment while working in a flexible, remote setting.
Key Requirements
Expertise in Google Spanner architecture and implementation.
Proven experience in designing target-state data architectures.
Proficiency in data modeling for functional correctness.
Deep understanding of performance Service Level Objectives (SLOs).
Extensive knowledge of transactional semantics in distributed databases.
Experience managing data architecture for large-scale app ecosystems (1,000+ apps).
Strong background in cloud-native database solutions.
Ability to lead architectural decisions prior to large-scale application rewrites.
Excellent communication and collaboration skills for remote work.
Analytical mindset with a focus on scalability and reliability.
Negotiable or Not Mentioned
Remote
17 days ago
ekcelsystems.com
1578 Views
Ekcel Systems has an immediate opening for a UKG Specialist with expertise in Dell Boomi for a long-term remote role. This position focuses on the integration aspect of the UKG software suite, utilizing the Dell Boomi platform to automate workflows and synchronize data across various HR and business applications. You will be responsible for the full lifecycle of integration projects, from initial design to deployment and monitoring.
You will work to build robust middleware solutions that enhance the functionality of UKG products. This role is perfect for a technical professional who enjoys solving complex integration puzzles and ensuring that data flows accurately and securely between systems. As a remote role, it offers flexibility while requiring a high degree of accountability and technical excellence. Salary information is not included in this posting.
Key Requirements
Advanced proficiency in the Dell Boomi integration platform (AtomSphere).
Experience building integrations specifically between UKG and third-party applications.
Strong understanding of REST/SOAP APIs and standard data formats like JSON/XML.
Capability to design, develop, test, and deploy Boomi processes efficiently.
Knowledge of UKG Pro or UKG Dimensions integration entry points.
Ability to implement robust error handling and automated monitoring solutions.
Strong technical documentation skills for architectural and process flows.
Ability to collaborate with cross-functional teams in a virtual environment.
Experience with enterprise-level middleware solutions and data mapping.
Bachelor’s degree in Computer Science, Information Systems, or a related field.
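The data-mapping work described above, which a Boomi map shape would normally perform, reduces to transforming one JSON record shape into another. A minimal sketch in code; every field name here is hypothetical, not an actual UKG API field:

```python
# Illustrative middleware-style mapping step: transform an employee
# record from a source JSON shape into a target schema, the way a
# Boomi map shape would. Field names are hypothetical.

def map_employee(source: dict) -> dict:
    # Split a single full-name field into first/last on the first space.
    first, _, last = source.get("fullName", "").partition(" ")
    return {
        "employee_id": source.get("employeeNumber"),
        "first_name": first,
        "last_name": last,
        "hire_date": source.get("dateOfHire"),
    }

src = {"employeeNumber": "E100", "fullName": "Jane Doe", "dateOfHire": "2021-03-15"}
out = map_employee(src)
```

In a real integration this transform would be paired with the error handling and monitoring the requirements mention, so malformed records are routed to a retry or dead-letter path rather than silently dropped.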
Negotiable or Not Mentioned
Remote
12 days ago
zelarsoft.com
873 Views
We are seeking a passionate and dedicated Junior DevOps Engineer to join our growing team. In this role, you will work on real-time cloud infrastructure projects, focusing primarily on Google Cloud Platform (GCP) environments. You will be responsible for handling deployments, managing CI/CD automation pipelines, and ensuring the smooth operation of enterprise systems through end-to-end task ownership. This is an excellent opportunity to collaborate closely with senior engineers and gain deep hands-on exposure to SRE practices.
Your daily responsibilities will involve monitoring cloud health, troubleshooting infrastructure issues, and optimizing resource usage using tools like Kubernetes and Terraform. As a Junior DevOps Engineer, you will contribute to the evolution of our deployment strategies and help maintain a robust, scalable architecture. We value continuous learning and offer a supportive environment where you can refine your skills in modern cloud technologies while delivering impactful solutions for our clients.
Key Requirements
2 to 3 years of professional experience in DevOps or Cloud roles.
Proven hands-on experience with Google Cloud Platform (GCP) services.
Strong proficiency in building and maintaining CI/CD pipelines.
Direct experience with Kubernetes orchestration and container management.
Knowledge of Infrastructure as Code (IaC) tools, specifically Terraform.
Familiarity with Site Reliability Engineering (SRE) concepts and practices.
Ability to demonstrate end-to-end task ownership and project delivery.
Strong scripting capabilities in languages such as Python or Shell.
Understanding of cloud networking, security, and monitoring tools.
Excellent collaborative skills to work effectively with senior engineering teams.
Negotiable or Not Mentioned
Remote
25 days ago
akunth.com
3177 Views
As a Database Administrator (DBA) at Akunth, you will play a pivotal role in managing, optimizing, and securing our extensive database environment. Your primary mission will be to ensure the high performance, availability, and reliability of our data systems, supporting the organization's critical business functions. This role involves a blend of proactive maintenance, such as installing and configuring database systems, and reactive problem-solving through troubleshooting and incident resolution. You will collaborate closely with development and data teams to align database strategies with application requirements and organizational goals.
In addition to day-to-day operations, you will be responsible for designing and implementing robust backup, recovery, and disaster recovery strategies to safeguard our data integrity. The ideal candidate will have a strong background in both relational and NoSQL databases, with a keen eye for performance tuning and security best practices. We offer a flexible work environment that supports remote and hybrid arrangements, allowing you to innovate and grow within a collaborative, data-focused culture. Your expertise will directly contribute to powering our data and driving overall performance across the company.
Key Requirements
4 to 10 years of professional experience in Database Administration.
Strong experience with multiple database systems including MySQL, PostgreSQL, Oracle, and SQL Server.
High proficiency in SQL and advanced database performance tuning techniques.
In-depth knowledge of database backup, recovery, and disaster recovery planning.
Solid understanding of database security, access control, and data integrity principles.
Experience with high availability configurations and database replication techniques.
Proven ability to manage database upgrades, patches, and complex data migrations.
Familiarity with cloud database platforms such as AWS RDS, Azure SQL, or Google Cloud SQL.
Knowledge of NoSQL database technologies like MongoDB or Cassandra.
Ability to automate tasks using scripting languages such as Python or Bash.
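The backup-and-retention planning and the scripting requirement above meet in small utilities like the one sketched below: deciding which backups to keep under a daily-plus-weekly policy. The retention numbers and dates are illustrative only; a real policy follows the organization's recovery objectives:

```python
# Sketch of a backup-retention rule a DBA might script: keep the last
# `keep_daily` daily backups, plus every Sunday backup from the last
# `keep_weekly` weeks. Policy numbers are illustrative.
from datetime import date, timedelta

def backups_to_keep(backup_dates, today, keep_daily=7, keep_weekly=4):
    keep = set()
    for d in backup_dates:
        age = (today - d).days
        if 0 <= age < keep_daily:
            keep.add(d)                    # recent daily backups
        elif d.weekday() == 6 and age < keep_weekly * 7:
            keep.add(d)                    # weekly (Sunday) backups
    return keep

today = date(2024, 6, 15)  # a Saturday
backup_dates = [today - timedelta(days=i) for i in range(30)]
kept = backups_to_keep(backup_dates, today)
```

The complement of `kept` is what a cleanup job would prune, typically after verifying that the retained backups actually restore.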
Negotiable or Not Mentioned
Remote
19 days ago
airshelf.ai
854 Views
AirShelf is offering a rare opportunity for talented AI builders to join a high-impact team. This role provides the unique chance to collaborate with experienced professionals who have successfully scaled global platforms reaching over 140 countries. We are looking for individuals who are truly AI-native and have demonstrated their skills through real-world projects rather than just theoretical certifications or introductory courses.
As an AI Builder at AirShelf, you will be responsible for designing and developing sophisticated AI-driven solutions. You will be expected to showcase what you have actually built, as we prioritize hands-on experience and proven results. This is a role for creators and innovators who want to push the boundaries of what is possible in the AI space and contribute to a platform with a significant global footprint.
Key Requirements
Proven experience building AI-native applications or platforms.
Strong proficiency in machine learning frameworks such as PyTorch or TensorFlow.
Expertise in Python programming and software engineering best practices.
Experience with Large Language Models (LLMs) and prompt engineering.
Demonstrated ability to build and deploy scalable AI products.
Strong problem-solving skills and a deep understanding of AI fundamentals beyond basic courses.
Ability to work independently and collaboratively in a fast-paced environment.
A portfolio of previous AI projects including GitHub repositories or live demos.
Knowledge of cloud platforms like AWS, GCP, or Azure for AI model deployment.
Continuous learner staying updated with the latest advancements in artificial intelligence.
Negotiable or Not Mentioned
Remote
24 days ago
aspireitc.com
1877 Views
AspireITC is seeking a highly skilled and experienced SAP Basis Administrator to join our technical team. In this role, you will be responsible for the management and maintenance of our SAP environments, with a specific focus on S/4HANA and HANA DB architectures. You will play a critical part in ensuring the stability, security, and performance of our enterprise systems, handling tasks ranging from routine monitoring to complex system transformations.
As an SAP Basis Consultant, your primary focus will involve executing system refreshes, managing comprehensive upgrades, and conducting detailed performance tuning to optimize system throughput. We are looking for a professional with a solid track record who can work independently to resolve infrastructure challenges and collaborate effectively with cross-functional teams. This is a remote opportunity for candidates with 6 to 10 years of relevant experience in the SAP ecosystem.
Key Requirements
6 to 10 years of professional experience in SAP Basis Administration.
Hands-on expertise in SAP S/4HANA environment management.
Deep technical knowledge of HANA Database administration and optimization.
Proven experience in executing end-to-end SAP system refreshes.
Strong background in managing complex SAP system upgrades and patches.
Demonstrated proficiency in performance tuning and system monitoring.
Experience with SAP Transport Management System (TMS) and change control.
Ability to troubleshoot and resolve complex technical issues within the SAP landscape.
Knowledge of SAP security principles, including roles and authorizations.
Strong communication skills for effective collaboration with technical and non-technical stakeholders.