0 Negotiable or Not Mentioned
USA, McLean, VA
28 days ago
S3Connections.com
2674 Views
We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize scalable data platforms that transform complex data into meaningful business insights. The ideal candidate will have strong expertise in SQL, Python, and ETL development, along with experience supporting cloud-based data migration and modern data ecosystems. You will be responsible for building and maintaining scalable ETL/data pipelines for structured and unstructured data while ensuring high-performance data solutions through advanced optimization techniques. The role requires an onsite presence in McLean, VA, five days a week to ensure close collaboration with team members and stakeholders.
The role involves collaborating with cross-functional teams to enhance data quality, accessibility, and system performance. You will implement best practices for data engineering, code quality, testing, and deployment. Additionally, the candidate will support cloud data migration initiatives, including data mapping, transformation, validation, and optimization. This position is critical for optimizing data workflows and ensuring high availability and reliability of data systems within an enterprise environment. Candidates should be prepared to create and maintain comprehensive technical documentation and data flow diagrams to support the platform's evolution.
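As a rough illustration of the ETL work this posting describes, here is a minimal Python sketch of an extract-transform-load cycle. It uses the standard-library sqlite3 module as a stand-in database; every table and column name is hypothetical rather than taken from the posting.

# Minimal ETL sketch: extract rows, apply a transformation, load into a target table.
# Table and column names are hypothetical stand-ins for the enterprise sources
# the posting describes.
import sqlite3

def run_etl(db_path=":memory:"):
    conn = sqlite3.connect(db_path)
    cur = conn.cursor()
    # Stand-in source data (extract step).
    cur.execute("CREATE TABLE raw_orders (id INTEGER, amount TEXT, region TEXT)")
    cur.executemany("INSERT INTO raw_orders VALUES (?, ?, ?)",
                    [(1, "19.99", "east"), (2, "5.00", None), (3, "bad", "west")])
    # Transform: cast amounts, drop rows that fail validation.
    clean = []
    for row_id, amount, region in cur.execute("SELECT id, amount, region FROM raw_orders"):
        try:
            clean.append((row_id, float(amount), region or "unknown"))
        except ValueError:
            continue  # a real pipeline would route this row to a dead-letter table
    # Load into the curated target.
    cur.execute("CREATE TABLE orders_curated (id INTEGER, amount REAL, region TEXT)")
    cur.executemany("INSERT INTO orders_curated VALUES (?, ?, ?)", clean)
    conn.commit()
    return cur.execute("SELECT COUNT(*) FROM orders_curated").fetchone()[0]

print(run_etl())  # -> 2 rows survive validation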
Key Requirements
8+ years of experience as a Data Engineer
Strong expertise in SQL and Python
Hands-on experience building and maintaining ETL pipelines in enterprise environments
Experience working with large datasets and complex data architectures
Experience with cloud platforms such as AWS, Azure, or GCP
Strong understanding of data modeling, data warehousing, and data transformation techniques
Experience in data migration and integration projects
Excellent problem-solving, analytical, and communication skills
Familiarity with orchestration tools like Airflow
Experience with CI/CD tools such as GitHub Actions or Jenkins
~$11,666/month Mentioned
United States, New York
7 days ago
gmail.com
1205 Views
We are actively seeking a highly skilled Senior Data Engineer to build and scale modern data infrastructure for a fast-growing organization within the Financial Services and Data & Analytics industry. In this role, you will play a critical part in designing, developing, and optimizing data pipelines and architectures that support advanced analytics and critical business intelligence initiatives. You will be responsible for ensuring the scalability and performance of data systems while maintaining the highest standards of data quality and governance.
The ideal candidate will have extensive experience in building scalable ETL/ELT pipelines and maintaining robust data warehouses and data lakes. You will work with large-scale structured and unstructured datasets, collaborating closely with data scientists and analysts to provide the foundational data structures needed for complex modeling. The position offers a competitive package ranging from $140,000 to $200,000 annually, plus bonuses and full benefits, based in New York.
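For a sense of the ETL/ELT pipeline work described above, the following is a hedged PySpark sketch; the S3 paths, column names, and the trades example are illustrative assumptions, not details from the posting.

# Hedged PySpark sketch of a lake-to-curated ETL job; paths and columns are placeholders.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("trades_etl").getOrCreate()

# Extract: raw structured data landed in the lake (path is hypothetical).
raw = spark.read.json("s3://example-lake/raw/trades/")

# Transform: standardize types and derive a partition column.
curated = (raw
           .withColumn("trade_ts", F.to_timestamp("trade_ts"))
           .withColumn("trade_date", F.to_date("trade_ts"))
           .filter(F.col("notional") > 0))

# Load: write partitioned Parquet for the analysts and data scientists downstream.
(curated.write
        .mode("overwrite")
        .partitionBy("trade_date")
        .parquet("s3://example-lake/curated/trades/"))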
Key Requirements
5+ years of professional experience in data engineering roles.
Strong proficiency in programming languages, particularly Python.
Advanced knowledge of SQL for complex data manipulation and querying.
Hands-on experience with Apache Spark for large-scale data processing.
Extensive experience with cloud platforms such as AWS, Azure, or GCP.
Proven track record with data warehousing solutions and architecture.
Strong understanding of big data technologies and distributed systems.
Ability to design and build scalable ETL and ELT pipelines.
Proficiency in maintaining and optimizing data lakes for performance.
Excellent collaboration skills for working with data scientists and analysts.
Experience in ensuring data quality, integrity, and corporate governance.
0 Negotiable or Not Mentioned
USA, McLean, VA
15 days ago
momentousa.com
1366 Views
Momentousa is hiring a Senior Data Engineer for an onsite W2 position located in McLean, VA. This role is ideal for a seasoned professional with deep expertise in big data technologies and data warehousing. You will be responsible for designing, building, and maintaining scalable data pipelines that process vast amounts of information to support our analytics and business intelligence initiatives.
The successful candidate will work extensively with AWS services, Spark, and PySpark to transform raw data into actionable insights. You will leverage your advanced SQL skills and Python knowledge to model data and optimize database performance. This role demands a high level of technical proficiency in Hive and general data modeling principles to ensure our data architecture is robust, efficient, and capable of supporting complex business queries.
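The Spark-plus-Hive stack named above might look something like this in practice: a minimal PySpark sketch, assuming a hypothetical Hive database (analytics) and invented table names.

# Illustrative PySpark-on-Hive job matching the stack above; database and table
# names are invented for the example.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("hive_rollup")
         .enableHiveSupport()   # required to read and write Hive tables
         .getOrCreate())

# Aggregate a (hypothetical) Hive fact table with Spark SQL...
daily = spark.sql("""
    SELECT event_date, channel, COUNT(*) AS events, SUM(revenue) AS revenue
    FROM analytics.fact_events
    GROUP BY event_date, channel
""")

# ...and persist the rollup back to Hive for BI consumers.
daily.write.mode("overwrite").saveAsTable("analytics.daily_channel_rollup")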
Key Requirements
Significant experience with AWS cloud data services
Expert-level knowledge of Spark and PySpark for data processing
Advanced proficiency in SQL, including complex query writing and optimization
Strong backend development skills using Python
Practical experience with Hive for data warehousing and querying
Proven ability in data modeling and architecture design
Experience building and maintaining robust ETL pipelines
Knowledge of performance tuning for big data applications
Ability to work onsite in McLean, VA on a regular basis
Strong analytical skills to interpret complex data sets
0 Negotiable or Not Mentioned
USA, Pittsburgh
24 days ago
skilzmatrix.com
2136 Views
PNC is currently seeking a highly experienced Senior Data Engineer with over 10 years of professional experience to join their team in Pittsburgh, PA. The successful candidate will play a critical role in designing, building, and maintaining scalable data pipelines leveraging the full suite of AWS cloud services. This position involves developing and optimizing sophisticated ETL and ELT workflows to handle both structured and semi-structured data, ensuring that high-performance analytics are available for business decision-making. Working within an agile environment, the role demands an expert-level understanding of data processing jobs using Python and PySpark.
In addition to pipeline construction, the engineer will be responsible for integrating and managing data within the Snowflake cloud data warehouse. This includes writing complex SQL queries for data transformation and validation, as well as supporting Power BI dashboards by delivering curated, analytics-ready datasets. Candidates must demonstrate a strong commitment to data quality, governance, performance, and security best practices. This role is offered on a W2 basis and is ideal for individuals with prior experience in the financial services or banking domain who are looking to apply their technical leadership in a dynamic corporate environment.
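A sketch of the Snowflake side of this role, assuming the official snowflake-connector-python driver; the account, credentials, and table names are placeholders, and the curated-table pattern shown is one common way to feed Power BI, not the client's actual design.

# Hedged sketch: build an analytics-ready table inside Snowflake (ELT style).
import snowflake.connector

conn = snowflake.connector.connect(
    user="SVC_ANALYTICS", password="...", account="example-account",
    warehouse="TRANSFORM_WH", database="ANALYTICS", schema="CURATED",
)
cur = conn.cursor()
try:
    # Transform inside the warehouse so Power BI reads a curated, pre-aggregated table.
    cur.execute("""
        CREATE OR REPLACE TABLE CURATED.DAILY_BALANCES AS
        SELECT account_id,
               DATE_TRUNC('day', posted_at) AS as_of_date,
               SUM(amount)                  AS net_change
        FROM RAW.TRANSACTIONS
        GROUP BY account_id, DATE_TRUNC('day', posted_at)
    """)
finally:
    cur.close()
    conn.close()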
Key Requirements
Minimum of 10 years of professional experience in Data Engineering or a related field.
Advanced proficiency in SQL, including complex querying and performance tuning.
Extensive experience designing and maintaining scalable data pipelines on AWS.
Expert knowledge of Python and PySpark for large-scale data processing.
Hands-on experience with Snowflake cloud data warehouse management and integration.
Proven ability to develop and optimize ETL/ELT workflows for various data formats.
Experience supporting Power BI through data modeling and performance optimization.
Familiarity with AWS services such as S3, Glue, EMR, Lambda, and Redshift.
Strong understanding of data quality frameworks, governance, and security best practices.
Ability to work effectively in an Agile/Scrum environment with cross-functional teams.
0 Negotiable or Not Mentioned
USA, Malvern, PA
22 days ago
judge.com
1218 Views
This position is for an AWS Data Analytics Engineer located in Malvern, Pennsylvania. The role follows a hybrid model requiring the candidate to be onsite from day one. The initial contract length is for one year, with a strong likelihood of being extended for multiple years. The recruitment process includes a video interview, and candidates are welcome to apply via C2C arrangements.
Technical responsibilities focus on utilizing Python and SQL for complex data queries and manipulation. The successful candidate will also be responsible for creating data visualizations and dashboards using Tableau and leveraging various AWS cloud services to manage and analyze large datasets. Applicants must submit their resume along with a copy of their work authorization for consideration.
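The Python + SQL + AWS combination this role calls for often takes the form of querying data in place; below is a hedged boto3 sketch using Amazon Athena, where the database, query, and results bucket are invented for the example.

# Run an ad-hoc analytics query through Amazon Athena and fetch the results.
import time
import boto3

athena = boto3.client("athena", region_name="us-east-1")

resp = athena.start_query_execution(
    QueryString="SELECT region, COUNT(*) AS cnt FROM events GROUP BY region",
    QueryExecutionContext={"Database": "analytics_db"},
    ResultConfiguration={"OutputLocation": "s3://example-athena-results/"},
)
qid = resp["QueryExecutionId"]

# Poll until the query finishes, then fetch the first page of results.
while True:
    state = athena.get_query_execution(QueryExecutionId=qid)["QueryExecution"]["Status"]["State"]
    if state in ("SUCCEEDED", "FAILED", "CANCELLED"):
        break
    time.sleep(1)

if state == "SUCCEEDED":
    rows = athena.get_query_results(QueryExecutionId=qid)["ResultSet"]["Rows"]
    print(rows[:5])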
Key Requirements
Proficiency in Python programming for data engineering tasks.
Strong expertise in SQL for performing complex data queries.
Extensive experience with Tableau for data visualization and reporting.
In-depth knowledge of AWS services related to data analytics.
Ability to work onsite in Malvern, PA following a hybrid model.
At least one year of experience in a similar data analytics role.
Experience with cloud-based data warehousing and architecture.
Strong analytical and problem-solving skills.
Ability to participate and perform well in video interviews.
Must provide valid work authorization documentation.
Experience with C2C project delivery models.
Excellent communication and collaboration skills.
0 Negotiable or Not Mentioned
USA, Philadelphia
24 days ago
keypixelusa.com
1595 Views
We are seeking a highly skilled and experienced Data Architect with specific expertise in the Health Care Payer Domain to join our team in Philadelphia, PA. The ideal candidate will have over 15 years of professional experience in data architecture and a deep understanding of how to manage and structure data within the healthcare insurance sector. You will be responsible for designing, creating, deploying, and managing our organization's data architecture, ensuring that it is robust, scalable, and meets the specific needs of the payer domain.
In this hybrid role, you will leverage your expertise in various database technologies and cloud platforms such as Azure, GCP, and AWS to build modern data environments. Your work will involve collaborating with cross-functional teams to integrate disparate data sources, improve data quality, and support analytical initiatives. You must be adept at translating business requirements into technical specifications and have a proven track record of delivering high-quality data solutions in complex environments.
Key Requirements
Minimum of 15 years of professional experience in Data Architecture.
Must have deep expertise specifically within the Health Care Payer Domain.
Advanced proficiency with cloud platforms including Azure, GCP, and AWS.
Strong background in both relational and non-relational database design.
Extensive experience in building and optimizing scalable data pipelines.
Mastery of various data modeling tools and architectural methodologies.
Comprehensive knowledge of healthcare data standards and security regulations.
Proven ability to lead and deliver large-scale data transformation projects.
Excellent stakeholder management and technical communication skills.
Ability to work effectively in a hybrid office environment in Philadelphia.
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
24 days ago
dsiginc.com
1728 Views
DSIG Inc is seeking a highly skilled and experienced Senior Data Engineer to join our team for a direct client project located in Harrisburg, PA. This role is a hybrid position, requiring the candidate to be local to the area and possess a valid Pennsylvania Driver’s License. The successful candidate will be responsible for designing, building, and maintaining robust data architectures and pipelines that support large-scale data processing. Experience with the State of Pennsylvania is a mandatory requirement for this role, as the candidate will be working closely with government-related data systems and processes.
In this position, you will leverage your expertise in SQL, ETL processes, and various data warehousing technologies to ensure data integrity and availability. You will also participate in face-to-face interviews and collaborate with cross-functional teams to translate business requirements into technical solutions. We are looking for a professional who is not only technically proficient but also an excellent communicator. If you have a passion for data engineering and meet the residency and licensing requirements, we encourage you to apply by sending your resume to the provided contact.
Key Requirements
Must possess a valid Pennsylvania Driver’s License.
Mandatory experience working with the State of Pennsylvania.
Candidate must be local to Harrisburg, PA for hybrid work.
Proven experience as a Senior Data Engineer or similar role.
Expertise in designing and maintaining scalable data pipelines.
Strong proficiency in SQL and relational database management.
Experience with ETL tools and data integration techniques.
Ability to attend face-to-face interviews in Harrisburg.
Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
Bachelor's degree in Computer Science, Information Technology, or related field.
0 Negotiable or Not Mentioned
USA, Hamilton Township, NJ
58 days ago
alltechconsultinginc.com
563 Views
We are seeking a senior Azure Data Engineer to join our team and support a mission-critical financial data platform. This is a long-term contract position requiring 4 days onsite per week in Hamilton Township, NJ. The primary focus of this role is to architect, build, and maintain highly secure, scalable, and audit-ready data pipelines utilizing core Azure technologies.
Key responsibilities involve the development of robust ETL/ELT processes using core Azure technologies.
0 Negotiable or Not Mentioned
USA, New York City
24 days ago
amaglobaltech.com
1755 Views
Ama Global Tech is seeking a skilled Data Engineer for a hybrid role located in New York City, NY. This position requires a professional who can design, build, and maintain scalable data pipelines and architectures. You will work closely with cross-functional teams to ensure data accessibility and quality, focusing on high-performance computing and cloud-based environments. The role involves a mix of remote work and onsite presence, specifically requiring local candidates capable of attending face-to-face interviews.
The ideal candidate will demonstrate mastery over the AWS ecosystem and the Databricks platform. You will be responsible for implementing data processing solutions using Spark and Python, while managing containerized applications with Docker and Kubernetes. We are looking for a proactive problem-solver who can navigate the complexities of data warehousing and data lakes to provide actionable insights for the business. A certification in Databricks Engineering is a significant plus for this position.
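As a hedged illustration of the Databricks/Spark work mentioned above: a short PySpark job that aggregates raw events and writes Delta output. The paths are placeholders, and Delta Lake is assumed only because the posting names Databricks.

# Sketch of a Databricks-style Spark job; all paths are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("clickstream_ingest").getOrCreate()

# Ingest raw JSON landed by upstream producers (placeholder path).
raw = spark.read.json("s3://example-raw/clickstream/2024-01-01/")

sessions = (raw
            .withColumn("event_ts", F.to_timestamp("event_ts"))
            .groupBy("session_id")
            .agg(F.count("*").alias("events"),
                 F.max("event_ts").alias("last_seen")))

# Write as Delta so downstream readers get ACID guarantees on the lake.
sessions.write.format("delta").mode("append").save("s3://example-lake/sessions/")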
Key Requirements
Strong experience with AWS services including S3, Lambda, and EMR.
Proficiency in Spark and Python for complex data engineering tasks.
Solid understanding of data warehouse and data lake (DW/DL) concepts.
Hands-on experience with Docker and Kubernetes for containerized environments.
Certified Databricks Engineer is highly preferred.
Excellent troubleshooting and debugging skills to resolve technical issues.
Ability to attend a mandatory Face-to-Face (F2F) interview in New York City.
Must be a local candidate currently residing in or near New York City.
Eligible for C2C with H1 or W2 with GC/USC status.
Strong communication skills for effective team collaboration.
0 Negotiable or Not Mentioned
USA, New York
3 days ago
convextech.com
505 Views
Convex Tech Inc. is seeking a skilled Data Engineer for a hybrid role based in New York. This position requires the candidate to work onsite three days a week and participate in an onsite interview process. The successful candidate will focus on designing and implementing scalable data pipelines within the Azure ecosystem, specifically utilizing Azure Databricks and Azure Data Factory. The role involves developing robust ETL/ELT workflows using Apache Spark and PySpark DataFrames to process large datasets efficiently while ensuring optimal performance and scalability.
Beyond core pipeline development, the Data Engineer will be responsible for maintaining data governance, security, and compliance. Key tasks include implementing data quality frameworks, managing data lineage, and supporting modern Lakehouse architectures. Candidates must possess a deep understanding of SQL-based transformations and Master Data Management (MDM) concepts to ensure data consistency and integrity across the organization. This is a contract-based opportunity for 6 months or more, specifically looking for USC or GC holders ready to work in a hybrid environment.
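A minimal sketch of the kind of PySpark data-quality gate this posting describes, assuming ABFS storage paths and arbitrary thresholds; a production framework would be far more configurable, but the pattern (count, check, fail fast, then publish) is the same.

# Hedged PySpark data-quality gate; paths and thresholds are illustrative.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("dq_gate").getOrCreate()

df = spark.read.parquet("abfss://raw@exampleaccount.dfs.core.windows.net/customers/")

total = df.count()
null_keys = df.filter(F.col("customer_id").isNull()).count()
dupes = total - df.dropDuplicates(["customer_id"]).count()

# Fail the pipeline run if quality thresholds are breached, so bad data
# never reaches the curated Lakehouse layer.
if total == 0 or null_keys / total > 0.01 or dupes > 0:
    raise ValueError(f"DQ gate failed: total={total}, null_keys={null_keys}, dupes={dupes}")

(df.write.format("delta").mode("overwrite")
   .save("abfss://curated@exampleaccount.dfs.core.windows.net/customers/"))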
Key Requirements
Design and implement scalable data pipelines using Azure Databricks and Azure Data Factory.
Develop and maintain robust ETL/ELT workflows using Apache Spark and PySpark DataFrames.
Build and optimize data pipelines for efficient data ingestion and processing of large datasets.
Utilize data governance tools to manage data access, security, compliance, and data lifecycle.
Implement data quality frameworks and maintain data lineage across enterprise data platforms.
Design and support modern data architecture using Lakehouse and distributed data processing.
Develop high-performance Spark and SQL-based data transformation procedures.
Apply Master Data Management (MDM) concepts to ensure data consistency and standardization.
Must be a US Citizen or Green Card holder (USC/GC only).
Willingness to work onsite in New York 3 days a week and attend an onsite interview.
0 Negotiable or Not Mentioned
USA, New York City
14 days ago
univedgeconsulting.com
1321 Views
As a Lead Application Architect, you will be responsible for designing and maintaining robust software application architectures that meet complex business requirements. You will take a lead role in data modeling, integration, and governance efforts to ensure data quality and consistency across the enterprise. This role requires a hybrid or on-site presence in New York City, working closely with stakeholders, developers, and IT teams to ensure the successful execution of technical projects.
Your expertise will be vital in ensuring the compatibility and integration of software components and data systems. You will monitor and enhance the performance, quality, and responsiveness of applications while developing detailed architecture models and guidelines. The ideal candidate will have strong Python full-stack solution architect skills, mandatory MongoDB expertise, and preferably experience within Google Cloud Platform environments. This position also offers the opportunity to work with climate data, adding a meaningful layer to the architectural challenges.
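To make the Python full-stack plus MongoDB requirement concrete, here is a minimal sketch using FastAPI and pymongo; the climate-readings collection, URI, and field names are hypothetical, chosen only because the posting mentions climate data.

# Minimal REST endpoint over MongoDB; collection schema and URI are invented.
from fastapi import FastAPI, HTTPException
from pymongo import MongoClient

app = FastAPI()
client = MongoClient("mongodb://localhost:27017")   # placeholder URI
readings = client["climate"]["station_readings"]    # hypothetical collection

@app.get("/stations/{station_id}/latest")
def latest_reading(station_id: str):
    # Return the most recent reading for one station.
    doc = readings.find_one({"station_id": station_id},
                            sort=[("observed_at", -1)],
                            projection={"_id": False})
    if doc is None:
        raise HTTPException(status_code=404, detail="station not found")
    return doc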
Key Requirements
Proven experience as a Python fullstack solution architect.
Mandatory expertise in MongoDB and database technologies.
Strong knowledge of Google Cloud Platform (GCP) or similar cloud services.
Ability to lead data modeling, integration, and governance efforts.
Proficiency in modern data processing technologies and metadata-driven modeling.
Expertise in microservices architecture and RESTful API design.
Solid understanding of front-end and back-end frameworks such as React, Angular, and Node.js.
Familiarity with CI/CD pipelines and DevOps practices.
In-depth knowledge of security best practices including IAM, encryption, and firewalls.
Experience with data orchestration, pipeline tools, and agile development methodologies.
0 Negotiable or Not Mentioned
USA, Stamford
30 days ago
aetalentsgroup.com
1829 Views
We are seeking a highly skilled Snowflake Developer to join our dynamic team for a contract duration of 6 or more months. This role is designed for a technical expert with a customer-focused mindset who can deliver excellent service to clients, partners, and stakeholders. You will be responsible for managing and resolving support tickets within SLA guidelines while working with critical integrations like Active Directory, LDAP, Outlook, Word, Excel, and Salesforce. The position requires a candidate who can handle customer calls professionally and track issue resolutions effectively to ensure high levels of client satisfaction.
In addition to development tasks, you will support software licensing and installations, perform routine server installations, and conduct necessary maintenance. Maintaining accurate documentation of all customer interactions is a key part of the role, as is the ability to troubleshoot and escalate complex technical issues to the appropriate channels. The environment is fast-paced, demanding a high attention to detail and the ability to multitask across various web-based technologies and enterprise tools. This opportunity allows for work based in Stamford or remotely, providing flexibility for the right candidate with the necessary experience and skills.
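A small sketch of routine Snowflake work in this role, using snowflake-connector-python with bind parameters; the account, warehouse, and ticket tables are invented stand-ins for the support-ticket data the posting describes.

# Parameterized Snowflake query over (hypothetical) ticket-SLA data.
import snowflake.connector

conn = snowflake.connector.connect(
    user="SUPPORT_DEV", password="...", account="example-account",
    warehouse="SUPPORT_WH", database="OPS", schema="TICKETS",
)
cur = conn.cursor()
try:
    # Bind parameters keep the query safe from injection.
    cur.execute(
        "SELECT ticket_id, opened_at, resolved_at "
        "FROM TICKETS.RESOLUTIONS "
        "WHERE DATEDIFF('hour', opened_at, resolved_at) > %s",
        (24,),
    )
    for ticket_id, opened_at, resolved_at in cur.fetchmany(10):
        print(ticket_id, opened_at, resolved_at)
finally:
    cur.close()
    conn.close()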
Key Requirements
Minimum of 8 years of professional experience in technical development roles.
Proven expertise as a Snowflake Developer with deep platform knowledge.
Strong troubleshooting and analytical skills to resolve complex technical issues.
Excellent verbal and written communication skills for stakeholder interaction.
Ability to multitask and maintain organization in a fast-paced environment.
High attention to detail regarding technical documentation and ticket tracking.
Customer-centric mindset with a proactive problem-solving attitude.
Familiarity with web-based technologies and standard enterprise software tools.
Hands-on experience with Active Directory and LDAP integrations.
Proficiency in Microsoft Office Suite including Word, Excel, and Outlook.
Ability to manage software licensing and perform server maintenance tasks.
Experience working with Salesforce and other CRM integrations.
0 Negotiable or Not Mentioned
USA, Arden
13 days ago
flexontechnologies.com
1427 Views
Flexon Technologies is currently hiring for the position of Databricks Python Engineer to join their team for a long-term project in Arden, Delaware. This is a day-one onsite role, requiring the candidate to be physically present at the location from the start of the engagement. The position is tailored for senior professionals with over 10 years of experience who can provide high-level consultancy and technical expertise. The focus of the role is within the Retail Digital domain, specifically integrating complex data solutions to improve business operations.
The technical requirements include deep proficiency in Python, Databricks, and PL/SQL, alongside experience with ServiceNow and the Aptos Store Inventory Management System. Candidates are expected to have a strong background in retail digital environments, as resumes without this domain expertise will not be considered. The role is offered at an hourly rate of $55/hr on a C2C basis. The successful candidate will be responsible for developing and managing data pipelines, optimizing store inventory systems, and ensuring seamless integration across various enterprise platforms.
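The Python-to-PL/SQL integration this role calls for might look like the following hedged sketch, using the python-oracledb driver; the DSN, package, and procedure names are hypothetical stand-ins for the client's inventory system.

# Invoke a (hypothetical) PL/SQL stored procedure from Python.
import oracledb

conn = oracledb.connect(user="inv_app", password="...",
                        dsn="dbhost.example.com/ORCLPDB1")
cur = conn.cursor()

# Call a stored procedure that reconciles one store's inventory, then read
# back an OUT parameter holding the number of adjusted rows.
adjusted = cur.var(int)
cur.callproc("inventory_pkg.reconcile_store", [1042, adjusted])
conn.commit()
print("rows adjusted:", adjusted.getvalue())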
Key Requirements
10+ years of hands-on experience as a technical consultant.
Deep expertise in Python programming for data engineering tasks.
Advanced proficiency with Databricks for processing large datasets.
Strong skills in writing and optimizing PL/SQL queries.
Prior experience working with ServiceNow platform.
Hands-on experience with Aptos Store Inventory Management System.
Essential domain experience in Retail Digital.
Ability to work onsite in Arden, DE from the first day of the contract.
Familiarity with Azure cloud infrastructure and services.
Understanding of big data ecosystems and Hadoop environments.
0 Negotiable or Not Mentioned
USA, Newark, NJ
56 days ago
comtecinfo.com
554 Views
We are seeking a motivated Jr. Power BI Developer to join our team in Newark, NJ. In this role, you will be responsible for developing and maintaining business intelligence solutions, creating impactful reports, and designing interactive dashboards that provide actionable insights. You will collaborate with cross-functional teams to identify data sources and optimize data models to support business objectives. This is an excellent opportunity for a junior developer looking to grow their skills in business intelligence.
0 Negotiable or Not Mentioned
USA, Malvern
29 days ago
judge.com
1313 Views
We are seeking a highly skilled AI/ML Data Scientist to join our team for a long-term engagement in Malvern, Pennsylvania. This is a hybrid role that requires the candidate to be onsite from day one to collaborate effectively with the local team. The successful applicant will be responsible for designing and implementing advanced machine learning algorithms and artificial intelligence models to extract valuable insights from complex datasets. You will work closely with cross-functional teams to identify business opportunities where AI and ML can drive significant impact and efficiency.
This contract position has an expected duration of over one year, providing a stable opportunity to contribute to high-impact projects. The selection process will include video interviews for shortlisted candidates. Applicants are required to submit a comprehensive resume along with a copy of their work authorization and a link to their professional LinkedIn profile. This role is ideal for individuals who thrive in a data-driven environment and possess a strong technical foundation in modern data science practices.
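As a self-contained illustration of the model-development loop described above, here is a short scikit-learn sketch on synthetic data; no client data or specific algorithm choice is implied by the posting.

# Train and evaluate a classifier on synthetic data.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=42)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2,
                                                    random_state=42)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)

# Evaluate with AUC, a typical metric for imbalanced business problems.
probs = model.predict_proba(X_test)[:, 1]
print(f"test AUC: {roc_auc_score(y_test, probs):.3f}")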
Key Requirements
Proficiency in Python, R, or similar programming languages for data analysis.
Proven experience in developing and deploying Machine Learning and AI models.
Ability to work onsite in Malvern, PA on a hybrid schedule from the first day.
Strong understanding of statistical concepts and probability theory.
Experience with data visualization tools such as Tableau, PowerBI, or Matplotlib.
Ability to clean, preprocess, and manage large-scale datasets efficiently.
A valid work authorization copy must be provided with the application.
Provision of a professional LinkedIn profile URL for background verification.
Excellent communication skills to translate technical findings to stakeholders.
Master’s or PhD degree in Computer Science, Data Science, or a related quantitative field.
0 Negotiable or Not Mentioned
USA, Virginia
57 days ago
techxplorers.in
554 Views
We are seeking a highly skilled and experienced Power BI Developer to join our team in Virginia. In this hybrid role, you will be responsible for designing, developing, and maintaining comprehensive business intelligence solutions within a dynamic, data-driven environment. You will work closely with medical data, necessitating strict adherence to HIPAA compliance standards, while developing impactful dashboards and reports that drive organizational decision-making.
0 Negotiable or Not Mentioned
USA, Richmond
17 days ago
diasoftwaresolutions.com
982 Views
DIA SOFTWARE SOLUTIONS LLC is seeking a highly experienced Systems Analyst 4 specializing in Data Analysis and Conversion for a position based in Richmond, VA. This role focuses on healthcare business systems data analysis, where the successful candidate will lead end-to-end data conversion strategies, including mapping, transformation, and validation. You will perform critical data analysis, reconciliation, and anomaly detection on large datasets using Teradata and SQL queries to ensure high data quality and accuracy across enterprise systems. The role requires a candidate who can navigate complex technical landscapes and provide actionable insights from large data volumes.
In addition to technical execution, the role involves collaborating closely with ETL teams, architects, and business stakeholders to ensure compliance with audit, regulatory, and data governance standards. You will manage mock conversions, testing cycles, and quality assessments while supporting SIT, UAT, and End-to-End testing activities. Candidates must be proficient in Agile methodologies and have a strong understanding of the healthcare IT domain. This is an excellent opportunity for a senior professional to drive data integrity and conversion excellence within a direct client environment.
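Reconciliation work of the kind described often reduces to comparing source and target extracts; below is a hedged pandas sketch where the two DataFrames stand in for legacy-system and Teradata extracts, with invented columns.

# Source-vs-target reconciliation: flag missing rows and value mismatches.
import pandas as pd

source = pd.DataFrame({"member_id": [1, 2, 3, 4],
                       "claim_total": [100.0, 250.5, 75.0, 0.0]})
target = pd.DataFrame({"member_id": [1, 2, 3],
                       "claim_total": [100.0, 250.5, 80.0]})

merged = source.merge(target, on="member_id", how="outer",
                      suffixes=("_src", "_tgt"), indicator=True)

# Rows missing on either side, and rows whose totals disagree.
missing = merged[merged["_merge"] != "both"]
mismatched = merged[(merged["_merge"] == "both") &
                    (merged["claim_total_src"] != merged["claim_total_tgt"])]

print(f"missing rows: {len(missing)}, value mismatches: {len(mismatched)}")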
Key Requirements
10+ years in Data Analysis & Data Conversion
Strong experience with ETL Design, Mapping Rules & Validation
Expertise in Teradata & SQL
Experience in Data Quality, Reconciliation & Reporting
Hands-on with Azure DevOps / Test Management Tools
Strong Agile Testing (SIT/UAT/E2E) experience
Excellent communication & stakeholder collaboration
Prior experience working within the healthcare industry
Ability to lead end-to-end data conversion strategy mapping
Proficiency in performing data reconciliation on large datasets
0 Negotiable or Not Mentioned
USA, Philadelphia, PA
16 days ago
apptadinc.com
1172 Views
Apptad Inc is seeking a highly skilled Sr. Full Stack Developer to join our team in Philadelphia, PA, in a hybrid capacity. This role is ideal for a veteran developer with over 10 years of experience looking to lead complex technical initiatives. You will be responsible for building advanced data pipelines and ETL processes using Airflow and Snowflake, while also supporting the development of sophisticated web applications using React, Material UI, and AngularJS. The position involves collaborating across multiple teams to ensure the delivery of high-quality software solutions and the resolution of intricate technical problems.
In this hybrid role, you will play a key part in implementing process improvements and driving automation across the development lifecycle. Candidates should possess a deep understanding of cloud environments, specifically AWS, and be proficient in containerization technologies such as Docker and Kubernetes. Given the nature of our projects, experience within the financial services or asset management industry is considered a significant advantage. You will also have the opportunity to utilize AI-enabled development tools like Copilot and Claude to enhance productivity and innovation within our tech stack.
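A minimal sketch of the Airflow-to-Snowflake pipeline pattern this role describes, written against the Airflow 2 API (the schedule argument assumes Airflow 2.4 or later); the task bodies are stubs and all names are placeholders.

# Two-task DAG: extract, then load into Snowflake.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # A real task would pull from an upstream API or an S3 drop.
    print("extracting positions feed")

def load_to_snowflake(**context):
    # A real task would use a Snowflake hook or operator with a connection ID.
    print("loading curated positions into Snowflake")

with DAG(
    dag_id="positions_pipeline",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    t_extract = PythonOperator(task_id="extract", python_callable=extract)
    t_load = PythonOperator(task_id="load", python_callable=load_to_snowflake)
    t_extract >> t_load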
Key Requirements
Minimum of 10 years of professional experience in full-stack software development.
Expert proficiency in Python and frameworks such as Django.
Extensive experience with front-end technologies including React, Angular, and Material UI.
Strong background in AWS services including EC2, S3, Lambda, SNS, and SQS.
Demonstrated expertise in containerization using Docker and orchestration with Kubernetes.
Proven experience with CI/CD tools like Jenkins, Gitlab, or Bamboo.
Deep knowledge of database systems including Snowflake, Redshift, and SQL.
Hands-on experience building and managing data pipelines with Apache Airflow.
Strong understanding of REST API design and DevOps best practices.
Familiarity with AI-enabled development tools such as Copilot or Claude.
Experience in the Financial Services or Asset Management domain is highly preferred.
Excellent collaborative skills and the ability to troubleshoot complex technical issues.
0 Negotiable or Not Mentioned
USA, Malvern PA
57 days ago
entech.com
554 Views
Entech is seeking a dedicated QA Automation Engineer to join our team in Malvern, PA. The successful candidate will be responsible for automating various product test cases and providing essential manual testing support to ensure high software quality. This hybrid role requires a professional who can effectively balance remote work with on-site collaboration at our Malvern facility to maintain seamless communication with the development team.
0 Negotiable or Not Mentioned
USA, Reston, VA
55 days ago
asofttek.com
549 Views
As a Senior OpenText Documentum Developer, you will be responsible for the design, development, and troubleshooting of complex Documentum applications. You will play a critical role in leveraging modern API development and AWS cloud services to enhance enterprise content management capabilities. This role involves architectural strategy, where you will identify and implement system enhancements and manage migrations to ensure high system performance.
0 Negotiable or Not Mentioned
USA, Reston
55 days ago
asofttek.com
549 Views
We are seeking a highly skilled Senior OpenText Documentum Developer to join our team in Reston, VA. In this role, you will be responsible for designing, developing, and troubleshooting complex Documentum applications while leveraging modern API development and AWS cloud services to significantly enhance our enterprise content management capabilities. Your expertise will be vital in identifying and implementing system enhancements and migrations.
0 Negotiable or Not Mentioned
USA, New York
14 days ago
primesoftconsulting.com
918 Views
We are looking for a highly skilled Full Stack Developer to join our dynamic team in New York. The ideal candidate will be responsible for developing and maintaining both front-end and back-end applications, ensuring high performance and responsiveness to requests from the front-end. You will work closely with other developers and stakeholders to deliver high-quality software solutions that meet business requirements.
The role requires a deep understanding of modern web technologies, including .NET and Python for backend services, and React or Angular for front-end development. You should be comfortable working with databases, designing schemas, and optimizing SQL queries. Additionally, experience with RESTful APIs, CI/CD pipelines, and Git is essential for this position.
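As a hedged example of the Python/REST portion of this stack, here is a minimal Flask sketch; the orders resource and its in-memory store are invented for illustration and stand in for the SQL database the posting mentions.

# Minimal REST resource with create and fetch endpoints.
from flask import Flask, jsonify, request

app = Flask(__name__)
_orders = {}   # in-memory stand-in for the real database

@app.post("/orders")
def create_order():
    payload = request.get_json()
    order_id = len(_orders) + 1
    _orders[order_id] = {"id": order_id, **payload}
    return jsonify(_orders[order_id]), 201

@app.get("/orders/<int:order_id>")
def get_order(order_id: int):
    order = _orders.get(order_id)
    return (jsonify(order), 200) if order else (jsonify(error="not found"), 404)

if __name__ == "__main__":
    app.run(debug=True)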
Key Requirements
Proficient in Backend technologies such as .NET and Python.
Strong experience with Frontend frameworks like React and Angular.
Expertise in modern JavaScript frameworks.
Solid understanding of database systems and SQL.
Experience in schema design and performance tuning.
Proficiency in API development and REST integration patterns.
Knowledge of DevOps practices, including Git.
Experience with CI/CD pipelines.
Strong testing discipline for software quality.
Excellent problem-solving and analytical skills.
0 Negotiable or Not Mentioned
USA, Raleigh
24 days ago
rgtalent.com
1366 Views
RgTalent Inc is seeking a highly skilled Senior Oracle DBA with Exadata expertise for a long-term contract role based in Raleigh, NC. This position operates on a hybrid model and requires the candidate to be local to the area. The successful candidate will be responsible for managing and maintaining complex Oracle database environments across development, test, and production stages, primarily running on Exadata and Linux within an AWS framework.
Key responsibilities include collaborating with application developers to design schemas, optimizing SQL queries, and leading modernization efforts for database scalability. The role also involves high-level technical support, implementation of disaster recovery using Oracle Data Guard, and managing replication via Oracle GoldenGate. Candidates must have extensive experience in database performance tuning and resource management for high-volume workloads to ensure system availability and efficiency.
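Routine availability checks in this role are often scripted; below is a hedged python-oracledb sketch querying v$database for the Data Guard role and open mode, with placeholder credentials and DSN.

# Check the Data Guard role and open mode of an Oracle database.
import oracledb

conn = oracledb.connect(user="dba_mon", password="...",
                        dsn="exadata-scan.example.com/PRODPDB")
cur = conn.cursor()

cur.execute("SELECT name, database_role, open_mode FROM v$database")
name, role, open_mode = cur.fetchone()
print(f"{name}: role={role}, open_mode={open_mode}")

# Alert if a standby has fallen out of its expected read-only state.
if role == "PHYSICAL STANDBY" and "READ ONLY" not in open_mode:
    print("WARNING: standby is not open for read-only queries")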
Key Requirements
Manage and maintain Oracle database environments on Exadata and Linux platforms.
Deep expertise in managing databases within AWS cloud environments.
Collaborate with application developers to define data requirements and design tables.
Assist with complex SQL development and performance optimization.
Lead the design, implementation, and management of application schemas and database structures.
Provide technical support for Oracle databases, including diagnosis of performance issues.
Develop and maintain procedures and best practices for database operations in the cloud.
Administer high-availability and disaster recovery configurations using Oracle Data Guard.
Implement and manage replication solutions such as Oracle GoldenGate.
Perform regular database backups, restores, patching, and upgrades.
Must be local to Raleigh, NC for a hybrid onsite role.
Possess valid work authorization (USC, GC, H4 EAD, GC EAD, TN, or OPT EAD).
0 Negotiable or Not Mentioned
USA, New York
51 days ago
intellectt.com
528 Views
We are seeking a highly skilled Machine Learning/ML Engineer to join our dynamic team in New York. The successful candidate will be responsible for designing, developing, and deploying advanced machine learning models and algorithms to solve complex business challenges. This role requires a strong background in data science and software engineering, as you will be working closely with cross-functional teams to integrate intelligent solutions into existing business systems.
0 Negotiable or Not Mentioned
USA, New York
1 day ago
cloudrover.io
141 Views
Cloud Rover is currently seeking a highly skilled Kafka Administrator to join our technical team in New York. In this role, you will be responsible for the setup, configuration, and maintenance of Kafka clusters to ensure high availability and performance for our streaming data pipelines. You will work closely with developers and operations teams to optimize messaging throughput and resolve any performance bottlenecks within the environment. Applicants must possess valid work authorization such as USC, GC, GC EAD, H4 EAD, or OPT EAD to be considered for this position.
Your daily responsibilities will include managing ZooKeeper instances, monitoring cluster health using industry-standard tools, and implementing robust security measures such as SSL and SASL. You will also be expected to automate routine tasks using scripting languages like Python or Shell and participate in disaster recovery planning. This position offers a unique opportunity to work on large-scale distributed systems in a fast-paced environment. Please ensure your application includes your LinkedIn profile, current location, and your specific work authorization status.
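A hedged sketch of one routine task for this role, creating a topic with an explicit retention policy via the kafka-python admin client; the broker address, topic name, and retention value are illustrative assumptions.

# Create a Kafka topic with explicit partitioning, replication, and retention.
from kafka.admin import KafkaAdminClient, NewTopic

admin = KafkaAdminClient(bootstrap_servers="broker1.example.com:9092",
                         client_id="kafka-admin-script")

topic = NewTopic(
    name="payments.events",
    num_partitions=12,
    replication_factor=3,
    topic_configs={"retention.ms": str(7 * 24 * 3600 * 1000)},  # 7 days
)

admin.create_topics([topic])
print("topic created:", topic.name)
admin.close()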
Key Requirements
Proven experience in managing and scaling Apache Kafka clusters.
Deep understanding of ZooKeeper and its role in Kafka orchestration.
Experience with Kafka security features including SSL, SASL, and ACLs.
Proficiency in Linux system administration and command-line tools.
Expertise in monitoring tools like Prometheus, Grafana, or Confluent Control Center.
Ability to write automation scripts using Python, Bash, or Shell.
Knowledge of data retention policies and Kafka topic configuration.
Experience with backup, restore, and disaster recovery procedures for Kafka.
Familiarity with containerization technologies like Docker or Kubernetes.
Strong troubleshooting skills for resolving connectivity and performance issues.
Excellent communication skills for collaborating with cross-functional teams.
Valid US work authorization (USC, GC, GC EAD, H4 EAD, or OPT EAD).
0 Negotiable or Not Mentioned
USA, Westlake
24 days ago
hclglobal.com
1154 Views
HCL Global is looking for a talented SAP Business Objects (BI Publisher) professional to join our team in a hybrid capacity based in Westlake, TX. This role focuses on leveraging SAP Business Objects to create robust Universe and Semantic Layers that drive data-driven decision-making. You will be responsible for building scalable analytics and reporting solutions that meet complex business requirements, ensuring high performance and data accuracy across all platforms.
The ideal candidate will have extensive experience with Snowflake and advanced SQL, specializing in performance tuning and query optimization. You will work closely with stakeholders to design data models for analytics and visualization, with a strong preference for candidates who have hands-on experience with Tableau. This is an excellent opportunity to work in a dynamic environment on cutting-edge data projects while utilizing your expertise in the SAP ecosystem.
Key Requirements
Strong experience with SAP Business Objects (Universe / Semantic Layer).
Expertise in data modeling for analytics and visualization.
Advanced SQL skills with hands-on Snowflake experience.
Experience in SQL performance tuning and query optimization.
Exposure to visualization tools (Tableau highly preferred).
Ability to build scalable analytics and reporting solutions.
Proficiency in BI Publisher for complex reporting requirements.
Excellent communication skills for cross-functional collaboration.
Strong analytical and problem-solving abilities.
Bachelor's degree in Computer Science, Data Science, or related field.
0 Negotiable or Not Mentioned
USA, Harrisburg
22 days ago
thoughtwavesoft.com
1409 Views
Thoughtwave Software and Solutions is currently seeking a highly skilled Power BI Developer to support the Pennsylvania Department of Community and Economic Development. This hybrid role based in Harrisburg, PA, involves a core focus on building advanced Power BI dashboards and performing complex SQL data analysis to drive business intelligence initiatives. The developer will be responsible for data modeling, integration, and ensuring that reporting and analytics meet the high standards of government operations.
The ideal candidate will possess a senior level of expertise, specifically requiring over 10 years of experience with Power BI and SQL. Key responsibilities include designing intuitive data visualizations, managing data workflows, and collaborating with stakeholders to translate technical data into actionable insights. This position offers a professional environment that balances remote work with onsite presence in Harrisburg, ensuring a collaborative and efficient approach to data engineering and analytics.
Key Requirements
Minimum of 10 years of professional experience with Power BI
Extensive background in SQL for advanced data analysis and querying
Proven expertise in data modeling and backend data integration
Strong portfolio of building and deploying complex Power BI dashboards
Experience in reporting and advanced analytics within an enterprise environment
Ability to work in a hybrid capacity located in Harrisburg, PA
Proficiency in DAX (Data Analysis Expressions) and Power Query
Strong understanding of data visualization best practices and principles
Ability to translate complex business requirements into technical solutions
Excellent communication skills for collaborating with government stakeholders
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
28 days ago
zeforge.com
2011 Views
This is a long-term position for a Senior .NET Application Developer (Technical Architect) based in Harrisburg, PA, supporting the State of Pennsylvania. The role requires local Harrisburg, PA profiles, as candidates should reside within driving distance of the office to report on-site if or when needed. Applicants are required to attach a copy of their Driver's License and their LinkedIn profile link when submitting their resumes for consideration. The developer will participate in the full software development lifecycle, including requirements analysis, design, development, testing, and deployment within an enterprise environment.
The technical environment for this role is robust, involving .NET/C#, Angular, TypeScript, and Entity Framework Core. The candidate will work extensively with REST APIs, Microsoft Azure, and Azure DevOps for CI/CD pipelines. Database management will involve Oracle, Azure SQL, and PostgreSQL databases, while integration tasks will include working with SAP BusinessObjects and enterprise data warehouse platforms. Proficiency in automation using Python or PowerShell is highly desired for this technical leadership role.
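Since the posting highlights Python automation against Azure DevOps, here is a hedged sketch that lists recent pipeline runs through the Azure DevOps REST API using requests; the organization, project, and personal access token are placeholders.

# List the five most recent Azure DevOps builds for a project.
import base64
import requests

ORG, PROJECT, PAT = "example-org", "example-project", "..."
auth = base64.b64encode(f":{PAT}".encode()).decode()

resp = requests.get(
    f"https://dev.azure.com/{ORG}/{PROJECT}/_apis/build/builds",
    params={"api-version": "7.0", "$top": 5},
    headers={"Authorization": f"Basic {auth}"},
    timeout=30,
)
resp.raise_for_status()

for build in resp.json()["value"]:
    print(build["buildNumber"], build["status"], build.get("result"))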
Key Requirements
10 or more years of professional IT experience in enterprise application development.
7 or more years of experience developing applications using .NET technologies such as C#, ASP.NET, and Web APIs.
Strong experience developing modern web applications using Angular or similar frameworks.
Extensive experience designing and optimizing solutions using relational databases such as Oracle, SQL Server, or Azure SQL.
Proven experience developing applications within Microsoft Azure environments.
Experience implementing source control, automated builds, and CI/CD pipelines using Azure DevOps.
Hands-on experience participating in the full software development lifecycle (SDLC).
Experience developing accessible web applications in compliance with WCAG guidelines.
Strong written and verbal communication skills for technical leadership and collaboration.
Experience developing automation or scripting solutions using Python or PowerShell.
Ability to integrate applications with business intelligence platforms such as SAP BusinessObjects.
Experience mentoring developers and providing technical leadership within software development teams.
0 Negotiable or Not Mentioned
USA, New York
19 days ago
techstargroup.com
921 Views
The FedRAMP Azure Architect is a high-level technical role responsible for the design, security, and governance of Microsoft Azure and Azure Government environments. The primary focus of this position is to ensure that cloud platforms meet the stringent FedRAMP Low, Moderate, and High compliance requirements necessary for federal operations. The architect will lead the strategy for cloud infrastructure, focusing on scalability and mission-readiness for U.S. federal workloads.
Key responsibilities include leading the Authorization to Operate (ATO) lifecycle activities and the rigorous implementation of security controls across cloud architectures. Candidates must possess deep expertise in Azure services and federal security standards. The role demands the ability to govern complex cloud environments while ensuring they remain compliant and performant in a highly regulated landscape. This is an onsite position based in New York, specifically seeking professionals capable of managing large-scale federal cloud projects.
Key Requirements
Must be a U.S. Citizen (no other visa types accepted).
Proven experience designing and securing Microsoft Azure environments.
Deep knowledge of Azure Government platforms.
Experience meeting FedRAMP Low, Moderate, and High compliance requirements.
Ability to lead Authorization to Operate (ATO) lifecycle activities.
Expertise in implementing complex security controls in a cloud environment.
Strong background in cloud architecture and governance.
Experience managing U.S. federal cloud workloads.
Excellent communication skills for cross-functional stakeholder management.
Ability to work onsite in New York.
0 Negotiable or Not Mentioned
USA, New York
17 days ago
gvrinfotek.com
1115 Views
GVR Infotek is currently seeking a highly experienced Staff Full Stack Engineer to join our team in New York (Manhattan) on an onsite basis. This role is designed for a seasoned professional with over 20 years of industry experience, specifically focusing on complex Event-Driven Architectures. You will be instrumental in designing, developing, and maintaining sophisticated systems that leverage real-time data and distributed computing patterns to meet high-performance requirements.
The successful candidate will lead technical initiatives across both frontend and backend domains, utilizing a modern stack including React, Next.js, Node.js, and NestJS. You will be responsible for orchestrating microservices and managing robust messaging systems such as Kafka and Azure Service Bus within a Microsoft Azure environment. Your expertise in CI/CD, Docker, and various database technologies (SQL, NoSQL) will be critical in ensuring the scalability and reliability of our platform while providing technical leadership to the engineering team.
Key Requirements
20+ years of professional experience in software engineering.
Expert-level knowledge of Event-Driven Architecture and real-time data processing.
Strong proficiency in frontend technologies including React, Next.js, TypeScript, and TailwindCSS.
Extensive experience in backend development using Node.js and NestJS.
Proven track record of building and managing Microservices and RESTful APIs.
Hands-on experience with messaging platforms like Kafka, Azure Service Bus, Event Grid, or BullMQ.
Deep expertise in cloud platforms, specifically Microsoft Azure.
Proficiency in containerization technologies such as Docker and implementing CI/CD pipelines.
Solid understanding of data management with SQL, NoSQL, and TypeORM.
Strong technical leadership skills with experience in distributed systems and architectural design.