0 Negotiable or Not Mentioned
USA, Malvern
28 days ago
judge.com
1370 Views
We are seeking a highly skilled AI/ML Data Scientist to join our team for a long-term engagement in Malvern, Pennsylvania. This is a hybrid role that requires the candidate to be onsite from day one to collaborate effectively with the local team. The successful applicant will be responsible for designing and implementing advanced machine learning algorithms and artificial intelligence models to extract valuable insights from complex datasets. You will work closely with cross-functional teams to identify business opportunities where AI and ML can drive significant impact and efficiency.
This contract position has an expected duration of over one year, providing a stable opportunity to contribute to high-impact projects. The selection process will include video interviews for shortlisted candidates. Applicants are required to submit a comprehensive resume along with a copy of their work authorization and a link to their professional LinkedIn profile. This role is ideal for individuals who thrive in a data-driven environment and possess a strong technical foundation in modern data science practices.
Key Requirements
Proficiency in Python, R, or similar programming languages for data analysis.
Proven experience in developing and deploying Machine Learning and AI models.
Ability to work onsite in Malvern, PA on a hybrid schedule from the first day.
Strong understanding of statistical concepts and probability theory.
Experience with data visualization tools such as Tableau, PowerBI, or Matplotlib.
Ability to clean, preprocess, and manage large-scale datasets efficiently.
A valid work authorization copy must be provided with the application.
Provision of a professional LinkedIn profile URL for background verification.
Excellent communication skills to translate technical findings to stakeholders.
Master’s or PhD degree in Computer Science, Data Science, or a related quantitative field.
0 Negotiable or Not Mentioned
USA, Malvern, PA
22 days ago
judge.com
1217 Views
This position is for an AWS Data Analytics Engineer located in Malvern, Pennsylvania. The role follows a hybrid model requiring the candidate to be onsite from day one. The initial contract length is for one year, with a strong likelihood of being extended for multiple years. The recruitment process includes a video interview, and candidates are welcome to apply via C2C arrangements.
Technical responsibilities focus on utilizing Python and SQL for complex data queries and manipulation. The successful candidate will also be responsible for creating data visualizations and dashboards using Tableau and leveraging various AWS cloud services to manage and analyze large datasets. Applicants must submit their resume along with a copy of their work authorization for consideration.
Key Requirements
Proficiency in Python programming for data engineering tasks.
Strong expertise in SQL for performing complex data queries.
Extensive experience with Tableau for data visualization and reporting.
In-depth knowledge of AWS services related to data analytics.
Ability to work onsite in Malvern, PA following a hybrid model.
Minimum of 1 year of experience in a similar data analytics role.
Experience with cloud-based data warehousing and architecture.
Strong analytical and problem-solving skills.
Ability to participate and perform well in video interviews.
Must provide valid work authorization documentation.
Experience with C2C project delivery models.
Excellent communication and collaboration skills.
0 Negotiable or Not Mentioned
USA, McLean, VA
28 days ago
S3Connections.com
1869 Views
We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize scalable data platforms that transform complex data into meaningful business insights. The ideal candidate will have strong expertise in SQL, Python, and ETL development, along with experience supporting cloud-based data migration and modern data ecosystems. You will be responsible for building and maintaining scalable ETL/data pipelines for structured and unstructured data.
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
24 days ago
dsiginc.com
1727 Views
DSIG Inc is seeking a highly skilled and experienced Senior Data Engineer to join our team for a direct client project located in Harrisburg, PA. This role is a hybrid position, requiring the candidate to be local to the area and possess a valid Pennsylvania Driver’s License. The successful candidate will be responsible for designing, building, and maintaining robust data architectures and pipelines that support large-scale data processing. Experience with the State of Pennsylvania is a mandatory requirement for this role, as the candidate will be working closely with government-related data systems and processes. In this position, you will leverage your expertise in SQL, ETL processes, and various data warehousing technologies to ensure data integrity and availability. You will also participate in face-to-face interviews and collaborate with multi-functional teams to translate business requirements into technical solutions. We are looking for a professional who is not only technically proficient but also an excellent communicator. If you have a passion for data engineering and meet the residency and licensing requirements, we encourage you to apply by sending your resume to the provided contact.
Key Requirements
Must possess a valid Pennsylvania Driver’s License.
Mandatory experience working with the State of Pennsylvania.
Candidate must be local to Harrisburg, PA for hybrid work.
Proven experience as a Senior Data Engineer or similar role.
Expertise in designing and maintaining scalable data pipelines.
Strong proficiency in SQL and relational database management.
Experience with ETL tools and data integration techniques.
Ability to attend face-to-face interviews in Harrisburg.
Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
Bachelor's degree in Computer Science, Information Technology, or related field.
~$11,666/month Mentioned
United States, New York
7 days ago
gmail.com
1185 Views
We are actively seeking a highly skilled Senior Data Engineer to build and scale modern data infrastructure for a fast-growing organization within the Financial Services and Data & Analytics industry. In this role, you will play a critical part in designing, developing, and optimizing data pipelines and architectures that support advanced analytics and critical business intelligence initiatives. You will be responsible for ensuring the scalability and performance of data systems while maintaining the highest standards of data quality and governance.
The ideal candidate will have extensive experience in building scalable ETL/ELT pipelines and maintaining robust data warehouses and data lakes. You will work with large-scale structured and unstructured datasets, collaborating closely with data scientists and analysts to provide the foundational data structures needed for complex modeling. The position offers a competitive package ranging from $140,000 to $200,000 annually, plus bonuses and full benefits, based in New York.
Key Requirements
5+ years of professional experience in data engineering roles.
Strong proficiency in programming languages, particularly Python.
Advanced knowledge of SQL for complex data manipulation and querying.
Hands-on experience with Apache Spark for large-scale data processing.
Extensive experience with cloud platforms such as AWS, Azure, or GCP.
Proven track record with data warehousing solutions and architecture.
Strong understanding of big data technologies and distributed systems.
Ability to design and build scalable ETL and ELT pipelines.
Proficiency in maintaining and optimizing data lakes for performance.
Excellent collaboration skills for working with data scientists and analysts.
Experience in ensuring data quality, integrity, and corporate governance.
0 Negotiable or Not Mentioned
USA, McLean, VA
14 days ago
momentousa.com
1364 Views
Momentousa is hiring a Senior Data Engineer for an onsite W2 position located in McLean, VA. This role is ideal for a seasoned professional with deep expertise in big data technologies and data warehousing. You will be responsible for designing, building, and maintaining scalable data pipelines that process vast amounts of information to support our analytics and business intelligence initiatives.
The successful candidate will work extensively with AWS services, Spark, and PySpark to transform raw data into actionable insights. You will leverage your advanced SQL skills and Python knowledge to model data and optimize database performance. This role demands a high level of technical proficiency in Hive and general data modeling principles to ensure our data architecture is robust, efficient, and capable of supporting complex business queries.
Key Requirements
Significant experience with AWS cloud data services
Expert-level knowledge of Spark and PySpark for data processing
Advanced proficiency in SQL, from basic queries through complex query optimization
Strong backend development skills using Python
Practical experience with Hive for data warehousing and querying
Proven ability in data modeling and architecture design
Experience building and maintaining robust ETL pipelines
Knowledge of performance tuning for big data applications
Ability to work onsite in McLean, VA on a regular basis
Strong analytical skills to interpret complex data sets
0 Negotiable or Not Mentioned
USA, Pittsburgh
24 days ago
skilzmatrix.com
2134 Views
PNC is currently seeking a highly experienced Super Senior Data Engineer with over 10 years of professional experience to join their team in Pittsburgh, PA. The successful candidate will play a critical role in designing, building, and maintaining scalable data pipelines leveraging the full suite of AWS cloud services. This position involves developing and optimizing sophisticated ETL and ELT workflows to handle both structured and semi-structured data, ensuring that high-performance analytics are available for business decision-making. Working within an agile environment, the role demands an expert-level understanding of data processing jobs using Python and PySpark.
In addition to pipeline construction, the engineer will be responsible for integrating and managing data within the Snowflake cloud data warehouse. This includes writing complex SQL queries for data transformation and validation, as well as supporting Power BI dashboards by delivering curated, analytics-ready datasets. Candidates must demonstrate a strong commitment to data quality, governance, performance, and security best practices. This role is offered on a W2 basis and is ideal for individuals with prior experience in the financial services or banking domain who are looking to apply their technical leadership in a dynamic corporate environment.
Key Requirements
Minimum of 10 years of professional experience in Data Engineering or a related field.
Advanced proficiency in SQL, including complex querying and performance tuning.
Extensive experience designing and maintaining scalable data pipelines on AWS.
Expert knowledge of Python and PySpark for large-scale data processing.
Hands-on experience with Snowflake cloud data warehouse management and integration.
Proven ability to develop and optimize ETL/ELT workflows for various data formats.
Experience supporting Power BI through data modeling and performance optimization.
Familiarity with AWS services such as S3, Glue, EMR, Lambda, and Redshift.
Strong understanding of data quality frameworks, governance, and security best practices.
Ability to work effectively in an Agile/Scrum environment with cross-functional teams.
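The transformation-and-validation work this posting describes can be sketched minimally; SQLite stands in for Snowflake purely to illustrate the pattern, and the table and column names below are invented for the example, not taken from the role.

```python
import sqlite3

# SQLite stands in for Snowflake here; table and column names
# are hypothetical, chosen only to show a pre-load validation query.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_transactions (txn_id INTEGER, account TEXT, amount REAL);
INSERT INTO stg_transactions VALUES
    (1, 'A-100', 250.0),
    (1, 'A-100', 250.0),
    (2, 'A-200', NULL),
    (3, 'A-300', 75.5);
""")

# Validation before load: count rows with duplicated keys and null amounts.
checks = conn.execute("""
SELECT
    (SELECT COALESCE(SUM(cnt), 0)
       FROM (SELECT COUNT(*) AS cnt
               FROM stg_transactions
              GROUP BY txn_id
             HAVING COUNT(*) > 1)) AS duplicate_rows,
    (SELECT COUNT(*) FROM stg_transactions
      WHERE amount IS NULL)        AS null_amounts
""").fetchone()

print(checks)  # (2, 1): txn_id 1 appears twice, one amount is NULL
```

In a real pipeline a nonzero result would typically fail or quarantine the batch rather than just print.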
0 Negotiable or Not Mentioned
USA, New York
51 days ago
intellectt.com
528 Views
We are seeking a highly skilled Machine Learning/ML Engineer to join our dynamic team in New York. The successful candidate will be responsible for designing, developing, and deploying advanced machine learning models and algorithms to solve complex business challenges. This role requires a strong background in data science and software engineering, as you will be working closely with cross-functional teams to integrate intelligent solutions into production systems.
0 Negotiable or Not Mentioned
USA, Cambridge, MA
23 days ago
proclinical.com
1131 Views
Join a pioneering team as a Research Scientist specializing in AI and Machine Learning for Biologics. In this role, you will be at the forefront of transforming the future of therapeutics by applying cutting-edge machine learning techniques to real-world biological challenges. Your work will focus on pushing the boundaries of antisense oligonucleotides, antibodies, and biologics discovery, contributing to a revolutionary approach in drug development and sequence-aware modeling.
Your primary responsibilities will include building scalable and reproducible AI/ML frameworks designed to accelerate the drug discovery process from initial research to therapeutic application. You will develop predictive models specifically tailored for oligonucleotide therapeutics and contribute to next-generation antibody discovery through advanced modeling of molecular sequences, structures, and interactions. This role is based in Cambridge, MA, with a flexible remote structure that includes occasional onsite visits to facilitate collaboration and laboratory synergy.
Key Requirements
3+ years of post-PhD industry experience in a relevant scientific field.
Proven expertise in ASO, siRNA, and/or antibody design and development.
Strong foundation in modern AI/ML tools, approaches, and statistical modeling.
Proficiency in programming languages such as Python or R for data analysis.
Experience with deep learning frameworks like PyTorch, TensorFlow, or Keras.
Strong understanding of structural biology and protein-ligand interactions.
Demonstrated ability to build scalable and reproducible computational frameworks.
Excellent communication skills to present technical concepts to cross-functional teams.
Track record of scientific publications or patents in AI/ML or Biologics.
Ability to manage multiple projects in a fast-paced drug discovery environment.
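As a loose illustration of the sequence-aware modeling this posting mentions, a common first step is featurizing a molecular sequence for a model. The alphabet and encoding below are a generic sketch of my own, not any specific discovery pipeline.

```python
# Minimal sketch of sequence featurization for sequence-aware models.
# The RNA alphabet and one-hot scheme are illustrative assumptions.
ALPHABET = "ACGU"  # RNA bases, e.g. for an antisense oligonucleotide

def one_hot(seq):
    """Encode a sequence as a list of one-hot vectors over ALPHABET."""
    index = {base: i for i, base in enumerate(ALPHABET)}
    vectors = []
    for base in seq.upper():
        row = [0] * len(ALPHABET)
        row[index[base]] = 1
        vectors.append(row)
    return vectors

encoded = one_hot("ACGU")
print(encoded[0])  # [1, 0, 0, 0] -> 'A'
```

Real frameworks typically feed such encodings (or learned embeddings) into PyTorch or TensorFlow models of structure and interaction.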
0 Negotiable or Not Mentioned
USA, Stamford
30 days ago
aetalentsgroup.com
1781 Views
We are seeking a highly skilled Snowflake Developer to join our dynamic team for a contract duration of 6 or more months. This role is designed for a technical expert with a customer-focused mindset who can deliver excellent service to clients, partners, and stakeholders. You will be responsible for managing and resolving support tickets within SLA guidelines while working with critical integrations like Active Directory, LDAP, Outlook, Word, and Excel.
0 Negotiable or Not Mentioned
USA, New York
14 days ago
primesoftconsulting.com
916 Views
We are looking for a highly skilled Full Stack Developer to join our dynamic team in New York. The ideal candidate will be responsible for developing and maintaining both front-end and back-end applications, ensuring high performance and responsiveness to requests from the front-end. You will work closely with other developers and stakeholders to deliver high-quality software solutions that meet business requirements. The role requires a deep understanding of modern web technologies, including .NET and Python for backend services, and React or Angular for front-end development. You should be comfortable working with databases, designing schemas, and optimizing SQL queries. Additionally, experience with RESTful APIs, CI/CD pipelines, and Git is essential for this position.
Key Requirements
Proficient in Backend technologies such as .NET and Python.
Strong experience with Frontend frameworks like React and Angular.
Expertise in modern JavaScript frameworks.
Solid understanding of database systems and SQL.
Experience in schema design and performance tuning.
Proficiency in API development and REST integration patterns.
Knowledge of DevOps practices, including Git.
Experience with CI/CD pipelines.
Strong testing discipline for software quality.
Excellent problem-solving and analytical skills.
0 Negotiable or Not Mentioned
USA, New York City
14 days ago
univedgeconsulting.com
1319 Views
As a Lead Application Architect, you will be responsible for designing and maintaining robust software application architectures that meet complex business requirements. You will take a lead role in data modeling, integration, and governance efforts to ensure data quality and consistency across the enterprise. This role requires a hybrid or on-site presence in New York City, working closely with stakeholders, developers, and IT teams to ensure the successful execution of technical projects.
Your expertise will be vital in ensuring the compatibility and integration of software components and data systems. You will monitor and enhance the performance, quality, and responsiveness of applications while developing detailed architecture models and guidelines. The ideal candidate will have strong Python full-stack solution architect skills, mandatory MongoDB expertise, and preferably experience within Google Cloud Platform environments. This position also offers the opportunity to work with climate data, adding a meaningful layer to the architectural challenges.
Key Requirements
Proven experience as a Python fullstack solution architect.
Mandatory expertise in MongoDB and database technologies.
Strong knowledge of Google Cloud Platform (GCP) or similar cloud services.
Ability to lead data modeling, integration, and governance efforts.
Proficiency in modern data processing technologies and metadata-driven modeling.
Expertise in microservices architecture and RESTful API design.
Solid understanding of front-end and back-end frameworks such as React, Angular, and Node.js.
Familiarity with CI/CD pipelines and DevOps practices.
In-depth knowledge of security best practices including IAM, encryption, and firewalls.
Experience with data orchestration, pipeline tools, and agile development methodologies.
0 Negotiable or Not Mentioned
USA, Arden
13 days ago
flexontechnologies.com
1425 Views
Flexon Technologies is currently hiring for the position of Databricks Python Engineer to join their team for a long-term project in Arden, Delaware. This is a day-one onsite role, requiring the candidate to be physically present at the location from the start of the engagement. The position is tailored for senior professionals with over 10 years of experience who can provide high-level consultancy and technical expertise. The focus of the role is within the Retail Digital domain, specifically integrating complex data solutions to improve business operations.
The technical requirements include a deep proficiency in Python, Databricks, and PL/SQL, alongside experience with ServiceNow and Aptos Store Inventory Management Systems. Candidates are expected to have a strong background in retail digital environments, as resumes without this domain expertise will not be considered. The role is offered at an hourly rate of $55/hr on a C2C basis. The successful candidate will be responsible for developing and managing data pipelines, optimizing store inventory systems, and ensuring seamless integration across various enterprise platforms.
Key Requirements
10+ years of hands-on experience as a technical consultant.
Deep expertise in Python programming for data engineering tasks.
Advanced proficiency with Databricks for processing large datasets.
Strong skills in writing and optimizing PL/SQL queries.
Prior experience working with ServiceNow platform.
Hands-on experience with Aptos Store Inventory Management System.
Essential domain experience in Retail Digital.
Ability to work onsite in Arden, DE from the first day of the contract.
Familiarity with Azure cloud infrastructure and services.
Understanding of big data ecosystems and Hadoop environments.
0 Negotiable or Not Mentioned
USA, New York
1 day ago
cloudrover.io
140 Views
Cloud Rover is currently seeking a highly skilled Kafka Administrator to join our technical team in New York. In this role, you will be responsible for the setup, configuration, and maintenance of Kafka clusters to ensure high availability and performance for our streaming data pipelines. You will work closely with developers and operations teams to optimize messaging throughput and resolve any performance bottlenecks within the environment. Applicants must possess valid work authorization such as USC, GC, GC EAD, H4 EAD, or OPT EAD to be considered for this position.
Your daily responsibilities will include managing ZooKeeper instances, monitoring cluster health using industry-standard tools, and implementing robust security measures such as SSL and SASL. You will also be expected to automate routine tasks using scripting languages like Python or Shell and participate in disaster recovery planning. This position offers a unique opportunity to work on large-scale distributed systems in a fast-paced environment. Please ensure your application includes your LinkedIn profile, current location, and your specific work authorization status.
Key Requirements
Proven experience in managing and scaling Apache Kafka clusters.
Deep understanding of ZooKeeper and its role in Kafka orchestration.
Experience with Kafka security features including SSL, SASL, and ACLs.
Proficiency in Linux system administration and command-line tools.
Expertise in monitoring tools like Prometheus, Grafana, or Confluent Control Center.
Ability to write automation scripts using Python, Bash, or Shell.
Knowledge of data retention policies and Kafka topic configuration.
Experience with backup, restore, and disaster recovery procedures for Kafka.
Familiarity with containerization technologies like Docker or Kubernetes.
Strong troubleshooting skills for resolving connectivity and performance issues.
Excellent communication skills for collaborating with cross-functional teams.
Valid US work authorization (USC, GC, GC EAD, H4 EAD, or OPT EAD).
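As a small illustration of the retention-policy and topic-configuration duties listed above, a topic's retention can be expressed through standard Kafka topic-level config keys; the sizing values below are placeholders, and applying them (via `kafka-topics.sh --config` or an admin client) is outside this sketch.

```python
# Sketch of a Kafka topic retention policy expressed as the standard
# topic-level config keys; the concrete values are illustrative only.
RETENTION_DAYS = 7

topic_config = {
    "cleanup.policy": "delete",                                  # delete (not compact) old segments
    "retention.ms": str(RETENTION_DAYS * 24 * 60 * 60 * 1000),   # time-based retention
    "retention.bytes": str(10 * 1024**3),                        # ~10 GiB cap per partition
}

print(topic_config["retention.ms"])  # "604800000"
```

Whichever of the time or size limit is hit first triggers segment deletion.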
0 Negotiable or Not Mentioned
USA, Newark, NJ
56 days ago
comtecinfo.com
554 Views
We are seeking a motivated Jr. Power BI Developer to join our team in Newark, NJ. In this role, you will be responsible for developing and maintaining business intelligence solutions, creating impactful reports, and designing interactive dashboards that provide actionable insights. You will collaborate with cross-functional teams to identify data sources and optimize data models to support business objectives. This is an excellent opportunity for an early-career developer to build hands-on business intelligence experience.
0 Negotiable or Not Mentioned
USA, New York
3 days ago
convextech.com
459 Views
Convex Tech Inc. is seeking a skilled Data Engineer for a hybrid role based in New York. This position requires the candidate to work onsite three days a week and participate in an onsite interview process. The successful candidate will focus on designing and implementing scalable data pipelines within the Azure ecosystem, specifically utilizing Azure Databricks and Azure Data Factory. The role involves developing robust ETL/ELT workflows using Apache Spark and PySpark DataFrames to process large datasets efficiently while ensuring optimal performance and scalability.
Beyond core pipeline development, the Data Engineer will be responsible for maintaining data governance, security, and compliance. Key tasks include implementing data quality frameworks, managing data lineage, and supporting modern Lakehouse architectures. Candidates must possess a deep understanding of SQL-based transformations and Master Data Management (MDM) concepts to ensure data consistency and integrity across the organization. This is a contract-based opportunity for 6 months or more, specifically looking for USC or GC holders ready to work in a hybrid environment.
Key Requirements
Design and implement scalable data pipelines using Azure Databricks and Azure Data Factory.
Develop and maintain robust ETL/ELT workflows using Apache Spark and PySpark DataFrames.
Build and optimize data pipelines for efficient data ingestion and processing of large datasets.
Utilize data governance tools to manage data access, security, compliance, and data lifecycle.
Implement data quality frameworks and maintain data lineage across enterprise data platforms.
Design and support modern data architecture using Lakehouse and distributed data processing.
Develop high-performance Spark and SQL-based data transformation procedures.
Apply Master Data Management (MDM) concepts to ensure data consistency and standardization.
Must be a US Citizen or Green Card holder (USC/GC only).
Willingness to work onsite in New York 3 days a week and attend an onsite interview.
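The Master Data Management consistency requirement above can be sketched in plain Python; a real pipeline of the kind described would express this with PySpark DataFrames, and the record fields and match key here are illustrative assumptions.

```python
# MDM-style standardize-and-deduplicate sketch in plain Python.
# Field names ("customer_id", "email", "updated") are hypothetical.
records = [
    {"customer_id": "c1", "email": " Ada@Example.com ", "updated": 2},
    {"customer_id": "c1", "email": "ada@example.com",   "updated": 1},
    {"customer_id": "c2", "email": "Grace@Example.com", "updated": 5},
]

def standardize(rec):
    """Normalize the fields used for matching (trim + lowercase email)."""
    rec = dict(rec)
    rec["email"] = rec["email"].strip().lower()
    return rec

def golden_records(recs):
    """Keep the most recently updated record per customer_id."""
    best = {}
    for rec in map(standardize, recs):
        key = rec["customer_id"]
        if key not in best or rec["updated"] > best[key]["updated"]:
            best[key] = rec
    return list(best.values())

print(golden_records(records))  # one "golden" record per customer
```

In PySpark the same step is commonly a `withColumn` normalization followed by a window/`dropDuplicates` over the match key.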
0 Negotiable or Not Mentioned
USA, New York
22 days ago
fusionplusinc.com
1568 Views
The Performance Engineering team is looking for a Lead AI-Driven Performance Engineer to serve as a Senior Consultant. The primary focus of this role is the development and scaling of PerfHub, an innovative AI-enabled ecosystem designed to automate end-to-end performance and resiliency validation. This platform aims to streamline the detection of application changes, automate test selection, and handle complex execution, analysis, and documentation tasks. The successful candidate will blend deep performance engineering expertise with cutting-edge AI technologies to drive efficiency across the entire organization.
Key responsibilities include utilizing Large Language Models and OpenAI tools to automate data-driven workflows and integrate testing with monitoring and analytics. You will work closely with application and infrastructure teams to define resiliency KPIs and provide high-level insights into performance trends. This position requires a strong technical background in Python and enterprise application architectures, particularly within regulated environments like financial services. You will be instrumental in transforming traditional testing practices into highly automated, scalable, and intelligent processes.
Key Requirements
8+ years of experience in Performance Engineering or related fields.
Strong foundation in performance testing, analysis, and certification.
Experience in building or contributing to automation platforms or frameworks.
Solid understanding of enterprise application architectures (web, distributed, cloud, databases).
Proficiency in Python for automation, orchestration, and data analysis.
Familiarity with performance tools and observability platforms.
Strong analytical skills and ability to collaborate across teams in a large enterprise.
Experience applying AI/ML or LLMs to engineering or testing workflows.
Familiarity with CI/CD integration and resiliency concepts.
Ability to translate technical outcomes into executive-level insights.
Experience in regulated financial services environments.
0 Negotiable or Not Mentioned
USA, Philadelphia
24 days ago
keypixelusa.com
1635 Views
We are seeking a highly skilled and experienced Data Architect with specific expertise in the Health Care Payer Domain to join our team in Philadelphia, PA. The ideal candidate will have over 15 years of professional experience in data architecture and a deep understanding of how to manage and structure data within the healthcare insurance sector. You will be responsible for designing, creating, deploying, and managing our organization's data architecture, ensuring that it is robust, scalable, and meets the specific needs of the payer domain.
In this hybrid role, you will leverage your expertise in various database technologies and cloud platforms such as Azure, GCP, and AWS to build modern data environments. Your work will involve collaborating with cross-functional teams to integrate disparate data sources, improve data quality, and support analytical initiatives. You must be adept at translating business requirements into technical specifications and have a proven track record of delivering high-quality data solutions in complex environments.
Key Requirements
Minimum of 15 years of professional experience in Data Architecture.
Must have deep expertise specifically within the Health Care Payer Domain.
Advanced proficiency with cloud platforms including Azure, GCP, and AWS.
Strong background in both relational and non-relational database design.
Extensive experience in building and optimizing scalable data pipelines.
Mastery of various data modeling tools and architectural methodologies.
Comprehensive knowledge of healthcare data standards and security regulations.
Proven ability to lead and deliver large-scale data transformation projects.
Excellent stakeholder management and technical communication skills.
Ability to work effectively in a hybrid office environment in Philadelphia.
0 Negotiable or Not Mentioned
USA, Virginia
56 days ago
techxplorers.in
554 Views
We are seeking a highly skilled and experienced Power BI Developer to join our team in Virginia. In this hybrid role, you will be responsible for designing, developing, and maintaining comprehensive business intelligence solutions within a dynamic, data-driven environment. You will work closely with medical data, necessitating a strict adherence to HIPAA compliance standards, while developing impactful dashboards and reports that drive organizational decision-making.
0 Negotiable or Not Mentioned
USA, Harrisburg
22 days ago
thoughtwavesoft.com
1408 Views
Thoughtwave Software and Solutions is currently seeking a highly skilled Power BI Developer to support the Pennsylvania Department of Community and Economic Development. This hybrid role based in Harrisburg, PA, involves a core focus on building advanced Power BI dashboards and performing complex SQL data analysis to drive business intelligence initiatives. The developer will be responsible for data modeling, integration, and ensuring that reporting and analytics meet the high standards of government operations.
The ideal candidate will possess a senior level of expertise, specifically requiring over 10 years of experience with Power BI and SQL. Key responsibilities include designing intuitive data visualizations, managing data workflows, and collaborating with stakeholders to translate technical data into actionable insights. This position offers a professional environment that balances remote work with onsite presence in Harrisburg, ensuring a collaborative and efficient approach to data engineering and analytics.
Key Requirements
Minimum of 10 years of professional experience with Power BI
Extensive background in SQL for advanced data analysis and querying
Proven expertise in data modeling and backend data integration
Strong portfolio of building and deploying complex Power BI dashboards
Experience in reporting and advanced analytics within an enterprise environment
Ability to work in a hybrid capacity located in Harrisburg, PA
Proficiency in DAX (Data Analysis Expressions) and Power Query
Strong understanding of data visualization best practices and principles
Ability to translate complex business requirements into technical solutions
Excellent communication skills for collaborating with government stakeholders
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
28 days ago
zeforge.com
1718 Views
This is a long-term position for a Senior .NET Application Developer (Technical Architect) based in Harrisburg, PA, supporting the State of Pennsylvania. The role requires local Harrisburg, PA candidates, who should reside within driving distance of the office to report on-site if or when needed. Applicants are required to attach a copy of their Driver's License and their LinkedIn profile link when submitting their resumes for consideration.
0 Negotiable or Not Mentioned
USA, New York City
24 days ago
amaglobaltech.com
1754 Views
Ama Global Tech is seeking a skilled Data Engineer for a hybrid role located in New York City, NY. This position requires a professional who can design, build, and maintain scalable data pipelines and architectures. You will work closely with cross-functional teams to ensure data accessibility and quality, focusing on high-performance computing and cloud-based environments. The role involves a mix of remote work and onsite presence, specifically requiring local candidates capable of attending face-to-face interviews.
The ideal candidate will demonstrate mastery over the AWS ecosystem and the Databricks platform. You will be responsible for implementing data processing solutions using Spark and Python, while managing containerized applications with Docker and Kubernetes. We are looking for a proactive problem-solver who can navigate the complexities of data warehousing and data lakes to provide actionable insights for the business. A certification in Databricks Engineering is a significant plus for this position.
Key Requirements
Strong experience with AWS services including S3, Lambda, and EMR.
Proficiency in Spark and Python for complex data engineering tasks.
Solid understanding of data warehousing and data lake (DW/DL) concepts.
Hands-on experience with Docker and Kubernetes for containerized environments.
Databricks Engineer certification is highly preferred.
Excellent troubleshooting and debugging skills to resolve technical issues.
Ability to attend a mandatory Face-to-Face (F2F) interview in New York City.
Must be a local candidate currently residing in or near New York City.
Eligible for C2C with H1 or W2 with GC/USC status.
Strong communication skills for effective team collaboration.
0 Negotiable or Not Mentioned
USA, Philadelphia, PA
16 days ago
apptadinc.com
1171 Views
Apptad Inc is seeking a highly skilled Sr. Full Stack Developer to join our team in Philadelphia, PA, in a hybrid capacity. This role is ideal for a veteran developer with over 10 years of experience looking to lead complex technical initiatives. You will be responsible for building advanced data pipelines and ETL processes using Airflow and Snowflake, while also supporting the development of sophisticated web applications using React, Material UI, and AngularJS. The position involves collaborating across multiple teams to ensure the delivery of high-quality software solutions and the resolution of intricate technical problems.
In this hybrid role, you will play a key part in implementing process improvements and driving automation across the development lifecycle. Candidates should possess a deep understanding of cloud environments, specifically AWS, and be proficient in containerization technologies such as Docker and Kubernetes. Given the nature of our projects, experience within the financial services or asset management industry is considered a significant advantage. You will also have the opportunity to utilize AI-enabled development tools like Copilot and Claude to enhance productivity and innovation within our tech stack.
Key Requirements
Minimum of 10 years of professional experience in full-stack software development.
Expert proficiency in Python and frameworks such as Django.
Extensive experience with front-end technologies including React, Angular, and Material UI.
Strong background in AWS services including EC2, S3, Lambda, SNS, and SQS.
Demonstrated expertise in containerization using Docker and orchestration with Kubernetes.
Proven experience with CI/CD tools like Jenkins, GitLab, or Bamboo.
Deep knowledge of database systems including Snowflake, Redshift, and SQL.
Hands-on experience building and managing data pipelines with Apache Airflow.
Strong understanding of REST API design and DevOps best practices.
Familiarity with AI-enabled development tools such as Copilot or Claude.
Experience in the Financial Services or Asset Management domain is highly preferred.
Excellent collaborative skills and the ability to troubleshoot complex technical issues.
0 Negotiable or Not Mentioned
USA, Philadelphia
20 days ago
arctrs.com
1303 Views
This role is designed for Oracle leaders who excel in designing and architecting enterprise-scale financial solutions within the Oracle Fusion O2C landscape. As a Solution Architect, you will be at the forefront of business transformation, owning the end-to-end Order-to-Cash lifecycle and influencing key architectural decisions. You will work closely with stakeholders to ensure that business visions are effectively translated into technical executions that are both scalable and efficient.
The position requires a deep technical and functional understanding of Oracle Fusion Receivables and Cash Management modules. You will be responsible for leading configurations, managing complex RICE components (Reports, Integrations, Conversions, Extensions), and building powerful insights using OTBI and BI Publisher. Additionally, the role involves overseeing data migrations using FBDI and ADFDI, managing system integrations via OIC, and designing robust PaaS-based solutions to support evolving business requirements.
Key Requirements
Mastery of the Oracle Fusion O2C (Order-to-Cash) lifecycle.
Deep expertise in Oracle Fusion Receivables and Cash Management modules.
Hands-on experience with Oracle Integration Cloud (OIC) and APIs.
Proficiency in FBDI and ADFDI for legacy data conversion and migration.
Advanced skills in SQL and managing enterprise data flows.
Expertise in reporting tools including OTBI and BI Publisher.
Proven ability to design PaaS-based scalable solutions.
12 to 15 years of professional experience in Oracle Cloud implementations.
Experience leading configurations across complex financial environments.
Capability to manage and deliver RICE (Reports, Integrations, Conversions, Extensions) components.
Strong ability to translate business requirements into technical architectures.
0 Negotiable or Not Mentioned
USA, Reston, VA
55 days ago
asofttek.com
548 Views
As a Senior OpenText Documentum Developer, you will be responsible for the design, development, and troubleshooting of complex Documentum applications. You will play a critical role in leveraging modern API development and AWS cloud services to enhance enterprise content management capabilities. This role involves architectural strategy, where you will identify and implement system enhancements and manage migrations to ensure high system performance.
0 Negotiable or Not Mentioned
USA, Ramsey, New Jersey
14 days ago
btisolutions.com
842 Views
As the Senior OutSystems Platform Architect (Tech Lead), you will play a pivotal role in anchoring a growing development and operations team that is transitioning into the OutSystems ecosystem. This full-time position located in Ramsey, NJ, requires a visionary leader who can set architecture standards and define the strategic direction for platform usage. You will be the primary technical authority, responsible for owning migration strategies from legacy systems and providing expert guidance through code reviews and best practices implementation.
Your daily responsibilities will involve end-to-end governance of the OutSystems platform, covering everything from initial design patterns to complex CI/CD pipeline management. You will integrate various enterprise systems and mentor team members to foster a culture of technical excellence. The ideal candidate will bridge the gap between high-level architectural design and hands-on technical execution, ensuring the platform is robust, secure, and scalable for future growth. Benefits for this role include Medical, Dental & Vision coverage, Paid Sick Leave, and a 401(K) plan.
Key Requirements
10+ years of professional experience in software engineering.
5+ years of hands-on experience specifically on the OutSystems platform.
Active OutSystems Architecture Specialist certification or equivalent architectural experience.
Strong proficiency in SQL database management and querying.
Advanced knowledge of C# and the .NET framework.
Proficiency in modern web technologies including HTML, CSS, and JavaScript.
Based in New Jersey or Greater NYC area with the ability to commute to Ramsey, NJ.
Experience in designing and governing end-to-end platform patterns.
Proven ability to lead and mentor development and operations teams.
Familiarity with enterprise security protocols such as SSO, Active Directory, and OAuth.
Knowledge of establishing CI/CD pipelines and platform integrations.
0 Negotiable or Not Mentioned
USA, Hanover
8 days ago
cloudrover.io
820 Views
We are currently seeking a skilled Full Java Stack Developer to join our team in Hanover. The ideal candidate will be responsible for developing and maintaining high-quality software applications, working across the entire development lifecycle from front-end user interfaces to back-end logic and database management.
In this role, you will collaborate with cross-functional teams to define, design, and ship new features. You will be expected to write clean, maintainable, and efficient code while ensuring the best possible performance, quality, and responsiveness of the applications. This position requires a strong understanding of Java technologies and modern web development frameworks.
Key Requirements
Proficiency in Java and J2EE technologies.
Experience with Spring Boot and Spring Framework.
Strong knowledge of front-end technologies like HTML, CSS, and JavaScript.
Experience with database systems such as MySQL, PostgreSQL, or MongoDB.
Knowledge of RESTful API design and implementation.
Familiarity with version control systems, specifically Git.
Understanding of Agile development methodologies and Scrum.
Ability to write unit and integration tests using JUnit or similar frameworks.
Excellent problem-solving skills and attention to detail.
Strong communication skills for effective team collaboration.
0 Negotiable or Not Mentioned
USA, Richmond
17 days ago
diasoftwaresolutions.com
980 Views
DIA SOFTWARE SOLUTIONS LLC is seeking a highly experienced Systems Analyst 4 specializing in Data Analysis and Conversion for a position based in Richmond, VA. This role focuses on healthcare business systems data analysis, where the successful candidate will lead end-to-end data conversion strategies, including mapping, transformation, and validation. You will perform critical data analysis, reconciliation, and anomaly detection on large datasets using Teradata and SQL queries to ensure high data quality and accuracy across enterprise systems. The role requires a candidate who can navigate complex technical landscapes and provide actionable insights from large data volumes.
In addition to technical execution, the role involves collaborating closely with ETL teams, architects, and business stakeholders to ensure compliance with audit, regulatory, and data governance standards. You will manage mock conversions, testing cycles, and quality assessments while supporting SIT, UAT, and End-to-End testing activities. Candidates must be proficient in Agile methodologies and have a strong understanding of the healthcare IT domain. This is an excellent opportunity for a senior professional to drive data integrity and conversion excellence within a direct client environment.
Key Requirements
10+ years in Data Analysis & Data Conversion
Strong experience with ETL Design, Mapping Rules & Validation
Expertise in Teradata & SQL
Experience in Data Quality, Reconciliation & Reporting
Hands-on with Azure DevOps / Test Management Tools
Strong Agile Testing (SIT/UAT/E2E) experience
Excellent communication & stakeholder collaboration
Prior experience working within the healthcare industry
Ability to lead end-to-end data conversion strategy, including mapping
Proficiency in performing data reconciliation on large datasets
0 Negotiable or Not Mentioned
USA, Malvern PA
56 days ago
entech.com
554 Views
Entech is seeking a dedicated QA Automation Engineer to join our team in Malvern, PA. The successful candidate will be responsible for automating various product test cases and providing essential manual testing support to ensure high software quality. This hybrid role requires a professional who can effectively balance remote work with on-site collaboration at our Malvern facility to maintain seamless communication with the development team.