0 Negotiable or Not Mentioned
USA, Reston, VA
55 days ago
asofttek.com
2193 Views
As a Senior OpenText Documentum Developer, you will be responsible for the design, development, and troubleshooting of complex Documentum applications. You will play a critical role in leveraging modern API development and AWS cloud services to enhance enterprise content management capabilities. This role involves architectural strategy, where you will identify and implement system enhancements and manage migrations to ensure high system performance and reliability.
The ideal candidate brings over a decade of experience in Documentum technologies and extensive knowledge of backend development using Java and Python. You will work within an Agile Scrum environment, utilizing tools like Jenkins, Git, and Jira to manage deployments and development lifecycles. This full-time, on-site position in Reston, VA, is also open for C2C candidates looking for a challenging role in high-level cloud-integrated ECM systems. Your expertise in AWS backend services like EC2, ECS, and Lambda will be vital for modernizing the infrastructure.
Key Requirements
Minimum of 10 years of experience in Documentum development and troubleshooting.
Deep knowledge of Content Server, D2 (Config/Smart View), xPlore, and DFC.
7+ years of experience with Java and Python programming languages.
5+ years of hands-on AWS experience including EC2, ECS, RDS, and Lambda.
Strong experience building REST APIs using the Spring Framework.
Ability to identify and implement system enhancements and complex migrations.
Solid understanding of software design patterns and architecture.
Proficiency with development tools like Git, Jenkins, Jira, and Tomcat.
Experience working with Relational Databases and SQL queries.
Comfortable working in an Agile/Scrum environment with strong self-management skills.
0 Negotiable or Not Mentioned
USA, New York City
24 days ago
amaglobaltech.com
1713 Views
Ama Global Tech is seeking a skilled Data Engineer for a hybrid role located in New York City, NY. This position requires a professional who can design, build, and maintain scalable data pipelines and architectures. You will work closely with cross-functional teams to ensure data accessibility and quality, focusing on high-performance computing and cloud-based environments. The role involves a mix of remote work and onsite presence, specifically requiring local candidates capable of attending face-to-face interviews.
The ideal candidate will demonstrate mastery over the AWS ecosystem and the Databricks platform. You will be responsible for implementing data processing solutions using Spark and Python, while managing containerized applications with Docker and Kubernetes. We are looking for a proactive problem-solver who can navigate the complexities of data warehousing and data lakes to provide actionable insights for the business. A certification in Databricks Engineering is a significant plus for this position.
Key Requirements
Strong experience with AWS services including S3, Lambda, and EMR.
Proficiency in Spark and Python for complex data engineering tasks.
Solid understanding of data warehousing and data lake (DW/DL) concepts.
Hands-on experience with Docker and Kubernetes for containerized environments.
Certified Databricks Engineer is highly preferred.
Excellent troubleshooting and debugging skills to resolve technical issues.
Ability to attend a mandatory Face-to-Face (F2F) interview in New York City.
Must be a local candidate currently residing in or near New York City.
Eligible for C2C with H1 or W2 with GC/USC status.
Strong communication skills for effective team collaboration.
0 Negotiable or Not Mentioned
USA, McLean
20 days ago
chasenextgen.com
1427 Views
Chase NextGen is seeking a highly skilled Java with AWS Developer for an urgent role based in McLean, Virginia. This position is onsite and requires a candidate with a deep technical background and at least 10 years of professional experience in software development. The ideal candidate will be proficient in a modern technology stack and capable of delivering high-quality cloud-based solutions in a fast-paced environment. This role is open to various work authorizations including USC, GC, EAD, and H1B, and supports W2, 1099, or C2C arrangements.
The candidate must have mandatory previous experience with Capital One projects. This background is essential for navigating the specific technical and procedural requirements of the client's environment. The role involves full-stack development duties, integrating robust backend services with modern frontend frameworks. As a Java with AWS Developer, you will be responsible for designing, developing, and deploying scalable applications while ensuring seamless integration with cloud services.
Key Requirements
Minimum 10 years of experience in software development focusing on Java.
Extensive professional experience with Amazon Web Services (AWS) cloud infrastructure.
Mandatory previous experience working on Capital One projects.
Proficiency in frontend technologies including React and Typescript.
Hands-on programming experience with Golang.
Must be able to work onsite in McLean, Virginia.
Valid US Work Authorization (USC, GC, EAD, or H1B).
Strong understanding of microservices and full-stack architecture.
Experience with DevOps tools and CI/CD pipelines.
Excellent communication skills and the ability to work in a collaborative team environment.
0 Negotiable or Not Mentioned
USA, Malvern, PA
22 days ago
judge.com
1215 Views
This position is for an AWS Data Analytics Engineer located in Malvern, Pennsylvania. The role follows a hybrid model requiring the candidate to be onsite from day one. The initial contract is for one year, with a strong likelihood of being extended for multiple years. The recruitment process includes a video interview, and candidates are welcome to apply via C2C arrangements.
Technical responsibilities focus on utilizing Python and SQL for complex data queries and manipulation. The successful candidate will also be responsible for creating data visualizations and dashboards using Tableau and leveraging various AWS cloud services to manage and analyze large datasets. Applicants must submit their resume along with a copy of their work authorization for consideration.
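The "complex data queries" this posting mentions typically lean on SQL window functions. As a minimal, self-contained sketch (using sqlite3 with an illustrative table; the table and column names are hypothetical, not taken from the posting), a running total per group looks like this:

```python
import sqlite3

# Hypothetical data: monthly volumes per desk. Names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE trades (desk TEXT, month TEXT, volume INTEGER)")
conn.executemany(
    "INSERT INTO trades VALUES (?, ?, ?)",
    [("equities", "2024-01", 120), ("equities", "2024-02", 90),
     ("rates", "2024-01", 40), ("rates", "2024-02", 70)],
)

# Window function: cumulative volume per desk, ordered by month.
rows = conn.execute("""
    SELECT desk, month, volume,
           SUM(volume) OVER (PARTITION BY desk ORDER BY month) AS running_total
    FROM trades
    ORDER BY desk, month
""").fetchall()

for row in rows:
    print(row)
```

The same `SUM(...) OVER (PARTITION BY ... ORDER BY ...)` shape carries over to the warehouse engines named in these listings (Redshift, Snowflake), which is what makes it a useful interview-level building block.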
Key Requirements
Proficiency in Python programming for data engineering tasks.
Strong expertise in SQL for performing complex data queries.
Extensive experience with Tableau for data visualization and reporting.
In-depth knowledge of AWS services related to data analytics.
Ability to work onsite in Malvern, PA following a hybrid model.
Minimum of 1 year of experience in a similar data analytics role.
Experience with cloud-based data warehousing and architecture.
Strong analytical and problem-solving skills.
Ability to participate and perform well in video interviews.
Must provide valid work authorization documentation.
Experience with C2C project delivery models.
Excellent communication and collaboration skills.
~11,666 Mentioned
United States, New York
7 days ago
gmail.com
1146 Views
We are actively seeking a highly skilled Senior Data Engineer to build and scale modern data infrastructure for a fast-growing organization within the Financial Services and Data & Analytics industry. In this role, you will play a critical part in designing, developing, and optimizing data pipelines and architectures that support advanced analytics and critical business intelligence initiatives. You will be responsible for ensuring the scalability and performance of data systems while maintaining the highest standards of data quality and governance.
The ideal candidate will have extensive experience in building scalable ETL/ELT pipelines and maintaining robust data warehouses and data lakes. You will work with large-scale structured and unstructured datasets, collaborating closely with data scientists and analysts to provide the foundational data structures needed for complex modeling. The position offers a competitive package ranging from $140,000 to $200,000 annually, plus bonuses and full benefits, based in New York.
Key Requirements
5+ years of professional experience in data engineering roles.
Strong proficiency in programming languages, particularly Python.
Advanced knowledge of SQL for complex data manipulation and querying.
Hands-on experience with Apache Spark for large-scale data processing.
Extensive experience with cloud platforms such as AWS, Azure, or GCP.
Proven track record with data warehousing solutions and architecture.
Strong understanding of big data technologies and distributed systems.
Ability to design and build scalable ETL and ELT pipelines.
Proficiency in maintaining and optimizing data lakes for performance.
Excellent collaboration skills for working with data scientists and analysts.
Experience in ensuring data quality, integrity, and corporate governance.
0 Negotiable or Not Mentioned
USA, Pittsburgh
24 days ago
skilzmatrix.com
2067 Views
PNC is currently seeking a highly experienced Super Senior Data Engineer with over 10 years of professional experience to join their team in Pittsburgh, PA. The successful candidate will play a critical role in designing, building, and maintaining scalable data pipelines leveraging the full suite of AWS cloud services. This position involves developing and optimizing sophisticated ETL and ELT workflows to handle both structured and semi-structured data, ensuring that high-performance analytics are available for business decision-making. Working within an agile environment, the role demands an expert-level understanding of data processing jobs using Python and PySpark.
In addition to pipeline construction, the engineer will be responsible for integrating and managing data within the Snowflake cloud data warehouse. This includes writing complex SQL queries for data transformation and validation, as well as supporting Power BI dashboards by delivering curated, analytics-ready datasets. Candidates must demonstrate a strong commitment to data quality, governance, performance, and security best practices. This role is offered on a W2 basis and is ideal for individuals with prior experience in the financial services or banking domain who are looking to apply their technical leadership in a dynamic corporate environment.
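The extract/transform/load workflow described above can be sketched in plain Python. This is a shape-of-the-pipeline illustration only: the posting's actual stack is PySpark on AWS with Snowflake as the warehouse, so sqlite stands in for the warehouse and the record fields are hypothetical.

```python
import csv
import io
import sqlite3

# Hypothetical semi-structured input with whitespace and missing values.
RAW = """account_id,balance,currency
A-100, 2500.50 ,usd
A-101,,usd
A-102, 130.00 ,eur
"""

def extract(raw: str):
    """Extract: parse raw CSV text into dict records."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(records):
    """Transform: trim whitespace, drop rows with missing balances,
    and normalize currency codes (simple data-quality rules)."""
    cleaned = []
    for rec in records:
        balance = rec["balance"].strip()
        if not balance:
            continue  # quality rule: skip incomplete rows
        cleaned.append((rec["account_id"], float(balance),
                        rec["currency"].strip().upper()))
    return cleaned

def load(rows, conn):
    """Load: write curated, analytics-ready rows into the warehouse."""
    conn.execute("CREATE TABLE IF NOT EXISTS accounts "
                 "(account_id TEXT, balance REAL, currency TEXT)")
    conn.executemany("INSERT INTO accounts VALUES (?, ?, ?)", rows)

conn = sqlite3.connect(":memory:")
curated = transform(extract(RAW))
load(curated, conn)
count = conn.execute("SELECT COUNT(*) FROM accounts").fetchone()[0]
print(count)  # two rows survive the quality filter
```

In a real PySpark job the same three stages would map onto a DataFrame read, a chain of column transformations and filters, and a Snowflake connector write.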
Key Requirements
Minimum of 10 years of professional experience in Data Engineering or a related field.
Advanced proficiency in SQL, including complex querying and performance tuning.
Extensive experience designing and maintaining scalable data pipelines on AWS.
Expert knowledge of Python and PySpark for large-scale data processing.
Hands-on experience with Snowflake cloud data warehouse management and integration.
Proven ability to develop and optimize ETL/ELT workflows for various data formats.
Experience supporting Power BI through data modeling and performance optimization.
Familiarity with AWS services such as S3, Glue, EMR, Lambda, and Redshift.
Strong understanding of data quality frameworks, governance, and security best practices.
Ability to work effectively in an Agile/Scrum environment with cross-functional teams.
0 Negotiable or Not Mentioned
USA, Philadelphia, PA
16 days ago
apptadinc.com
1168 Views
Apptad Inc is seeking a highly skilled Sr. Full Stack Developer to join our team in Philadelphia, PA, in a hybrid capacity. This role is ideal for a veteran developer with over 10 years of experience looking to lead complex technical initiatives. You will be responsible for building advanced data pipelines and ETL processes using Airflow and Snowflake, while also supporting the development of sophisticated web applications using React, Material UI, and AngularJS. The position involves collaborating across multiple teams to ensure the delivery of high-quality software solutions and the resolution of intricate technical problems.
In this hybrid role, you will play a key part in implementing process improvements and driving automation across the development lifecycle. Candidates should possess a deep understanding of cloud environments, specifically AWS, and be proficient in containerization technologies such as Docker and Kubernetes. Given the nature of our projects, experience within the financial services or asset management industry is considered a significant advantage. You will also have the opportunity to utilize AI-enabled development tools like Copilot and Claude to enhance productivity and innovation within our tech stack.
Key Requirements
Minimum of 10 years of professional experience in full-stack software development.
Expert proficiency in Python and frameworks such as Django.
Extensive experience with front-end technologies including React, Angular, and Material UI.
Strong background in AWS services including EC2, S3, Lambda, SNS, and SQS.
Demonstrated expertise in containerization using Docker and orchestration with Kubernetes.
Proven experience with CI/CD tools like Jenkins, Gitlab, or Bamboo.
Deep knowledge of database systems including Snowflake, Redshift, and SQL.
Hands-on experience building and managing data pipelines with Apache Airflow.
Strong understanding of REST API design and DevOps best practices.
Familiarity with AI-enabled development tools such as Copilot or Claude.
Experience in the Financial Services or Asset Management domain is highly preferred.
Excellent collaborative skills and the ability to troubleshoot complex technical issues.
0 Negotiable or Not Mentioned
USA, Mclean, VA
14 days ago
momentousa.com
1330 Views
Momentousa is hiring a Senior Data Engineer for an onsite W2 position located in McLean, VA. This role is ideal for a seasoned professional with deep expertise in big data technologies and data warehousing. You will be responsible for designing, building, and maintaining scalable data pipelines that process vast amounts of information to support our analytics and business intelligence initiatives.
The successful candidate will work extensively with AWS services, Spark, and Pyspark to transform raw data into actionable insights. You will leverage your advanced SQL skills and Python knowledge to model data and optimize database performance. This role demands a high level of technical proficiency in Hive and general data modeling principles to ensure our data architecture is robust, efficient, and capable of supporting complex business queries.
Key Requirements
Significant experience with AWS cloud data services
Expert-level knowledge of Spark and Pyspark for data processing
Advanced proficiency in SQL, including complex querying and query optimization
Strong backend development skills using Python
Practical experience with Hive for data warehousing and querying
Proven ability in data modeling and architecture design
Experience building and maintaining robust ETL pipelines
Knowledge of performance tuning for big data applications
Ability to work onsite in McLean, VA on a regular basis
Strong analytical skills to interpret complex data sets
0 Negotiable or Not Mentioned
USA, McLean, VA
28 days ago
S3Connections.com
2303 Views
We are seeking a highly skilled Senior Data Engineer to design, develop, and optimize scalable data platforms that transform complex data into meaningful business insights. The ideal candidate will have strong expertise in SQL, Python, and ETL development, along with experience supporting cloud-based data migration and modern data ecosystems. You will be responsible for building and maintaining scalable ETL/data pipelines for structured and unstructured data while ensuring high-performance data solutions through advanced techniques. The role requires a presence onsite in McLean, VA, for five days a week to ensure close collaboration with team members and stakeholders.
The role involves collaborating with cross-functional teams to enhance data quality, accessibility, and system performance. You will implement best practices for data engineering, code quality, testing, and deployment. Additionally, the candidate will support cloud data migration initiatives, including data mapping, transformation, validation, and optimization. This position is critical for optimizing data workflows and ensuring high availability and reliability of data systems within an enterprise environment. Candidates should be prepared to create and maintain comprehensive technical documentation and data flow diagrams to support the platform's evolution.
Key Requirements
8+ years of experience as a Data Engineer
Strong expertise in SQL and Python
Hands-on experience building and maintaining ETL pipelines in enterprise environments
Experience working with large datasets and complex data architectures
Experience with cloud platforms such as AWS, Azure, or GCP
Strong understanding of data modeling, data warehousing, and data transformation techniques
Experience in data migration and integration projects
Excellent problem-solving, analytical, and communication skills
Familiarity with orchestration tools like Airflow
Experience with CI/CD tools such as GitHub or Jenkins
0 Negotiable or Not Mentioned
USA, New York
14 days ago
primesoftconsulting.com
914 Views
We are looking for a highly skilled Full Stack Developer to join our dynamic team in New York. The ideal candidate will be responsible for developing and maintaining both front-end and back-end applications, ensuring high performance and responsiveness to requests from the front-end. You will work closely with other developers and stakeholders to deliver high-quality software solutions that meet business requirements. The role requires a deep understanding of modern web technologies, including .NET and Python for backend services, and React or Angular for front-end development. You should be comfortable working with databases, designing schemas, and optimizing SQL queries. Additionally, experience with RESTful APIs, CI/CD pipelines, and Git is essential for this position.
Key Requirements
Proficient in backend technologies such as .NET and Python.
Strong experience with frontend frameworks like React and Angular.
Expertise in modern JavaScript frameworks.
Solid understanding of database systems and SQL.
Experience in schema design and performance tuning.
Proficiency in API development and REST integration patterns.
Knowledge of DevOps practices, including Git.
Experience with CI/CD pipelines.
Strong testing discipline for software quality.
Excellent problem-solving and analytical skills.
0 Negotiable or Not Mentioned
USA, Newark, NJ
24 days ago
sapphiresoftwaresolutions.com
1321 Views
Sapphire Software Solutions is seeking a seasoned Java Backend Technical Lead to spearhead the design, development, and deployment of high-performance, scalable backend applications. This role is based in Newark, NJ, and requires a dedicated professional to work onsite five days a week. You will be responsible for leading the technical direction of the backend team, ensuring that microservices and Spring Boot applications are architected for maximum efficiency and scalability. The ideal candidate will have extensive experience in distributed systems and a proven ability to mentor junior developers while delivering mission-critical software solutions.
A significant portion of this role involves building and managing sophisticated event-driven systems utilizing Apache Kafka. This includes the development of producers, consumers, and listeners to facilitate seamless message processing. You will oversee the persistence of consumed events into PostgreSQL and other relational or NoSQL databases, ensuring data integrity and system reliability. Furthermore, you will be expected to implement robust fault-tolerant designs and maintain high standards for CI/CD pipelines and build automation. As a technical leader, you will collaborate with stakeholders to define system integrations and drive the adoption of best practices across the development lifecycle.
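The event flow described above (producers publish, consumers/listeners drain the topic, consumed events are persisted) can be sketched minimally. This is a stand-in illustration, not the posting's actual code: an in-memory queue plays the role of the Kafka topic and sqlite plays the role of PostgreSQL; the event fields are hypothetical.

```python
import queue
import sqlite3

topic = queue.Queue()  # stand-in for a Kafka topic
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE events (event_id TEXT PRIMARY KEY, payload TEXT)")

def produce(event_id: str, payload: str):
    """Producer: publish an event to the topic."""
    topic.put({"event_id": event_id, "payload": payload})

def consume_all():
    """Consumer/listener: drain the topic and persist each event.
    INSERT OR IGNORE makes the write idempotent, a common pattern when
    at-least-once delivery means a consumer may see a message twice."""
    while not topic.empty():
        event = topic.get()
        db.execute("INSERT OR IGNORE INTO events VALUES (?, ?)",
                   (event["event_id"], event["payload"]))
    db.commit()

produce("evt-1", "order created")
produce("evt-2", "order shipped")
produce("evt-1", "order created")  # duplicate delivery
consume_all()
persisted = db.execute("SELECT COUNT(*) FROM events").fetchone()[0]
print(persisted)  # duplicates collapse to 2 rows
```

The idempotent-write detail is the part that matters for the "fault-tolerant designs" requirement: with Kafka's at-least-once semantics, deduplicating on a stable event key at the persistence layer is what keeps replays from corrupting the store.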
Key Requirements
12+ years of hands-on experience in Java backend development.
Strong expertise in Java, Spring Boot, and Microservices architecture.
Hands-on experience with Apache Kafka, including producers, consumers, and listener patterns.
Proven experience in building and maintaining event-driven systems.
Strong experience with REST API development and complex system integration.
Expert knowledge of PostgreSQL and relational database management systems.
Deep understanding of distributed systems and message-driven architecture.
Expertise in designing fault-tolerant systems and high-availability solutions.
Hands-on experience with CI/CD pipelines and modern build tools.
Ability to lead the design, development, and deployment phases of large-scale projects.
Experience with NoSQL databases for event persistence.
Excellent communication and leadership skills to manage technical teams.
0 Negotiable or Not Mentioned
USA, New York
1 day ago
cloudrover.io
97 Views
Cloud Rover is currently seeking a highly skilled Kafka Administrator to join our technical team in New York. In this role, you will be responsible for the setup, configuration, and maintenance of Kafka clusters to ensure high availability and performance for our streaming data pipelines. You will work closely with developers and operations teams to optimize messaging throughput and resolve any performance bottlenecks within the environment. Applicants must possess valid work authorization such as USC, GC, GC EAD, H4 EAD, or OPT EAD to be considered for this position.
Your daily responsibilities will include managing ZooKeeper instances, monitoring cluster health using industry-standard tools, and implementing robust security measures such as SSL and SASL. You will also be expected to automate routine tasks using scripting languages like Python or Shell and participate in disaster recovery planning. This position offers a unique opportunity to work on large-scale distributed systems in a fast-paced environment. Please ensure your application includes your LinkedIn profile, current location, and your specific work authorization status.
Key Requirements
Proven experience in managing and scaling Apache Kafka clusters.
Deep understanding of ZooKeeper and its role in Kafka orchestration.
Experience with Kafka security features including SSL, SASL, and ACLs.
Proficiency in Linux system administration and command-line tools.
Expertise in monitoring tools like Prometheus, Grafana, or Confluent Control Center.
Ability to write automation scripts using Python, Bash, or Shell.
Knowledge of data retention policies and Kafka topic configuration.
Experience with backup, restore, and disaster recovery procedures for Kafka.
Familiarity with containerization technologies like Docker or Kubernetes.
Strong troubleshooting skills for resolving connectivity and performance issues.
Excellent communication skills for collaborating with cross-functional teams.
Valid US work authorization (USC, GC, GC EAD, H4 EAD, or OPT EAD).
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
23 days ago
dsiginc.com
1648 Views
DSIG Inc is seeking a highly skilled and experienced Senior Data Engineer to join our team for a direct client project located in Harrisburg, PA. This role is a hybrid position, requiring the candidate to be local to the area and possess a valid Pennsylvania Driver’s License. The successful candidate will be responsible for designing, building, and maintaining robust data architectures and pipelines that support large-scale data processing. Experience with the State of Pennsylvania is a mandatory requirement for this role, as the candidate will be working closely with government-related data systems and processes.
In this position, you will leverage your expertise in SQL, ETL processes, and various data warehousing technologies to ensure data integrity and availability. You will also participate in face-to-face interviews and collaborate with multi-functional teams to translate business requirements into technical solutions. We are looking for a professional who is not only technically proficient but also an excellent communicator. If you have a passion for data engineering and meet the residency and licensing requirements, we encourage you to apply by sending your resume to the provided contact.
Key Requirements
Must possess a valid Pennsylvania Driver’s License.
Mandatory experience working with the State of Pennsylvania.
Candidate must be local to Harrisburg, PA for hybrid work.
Proven experience as a Senior Data Engineer or similar role.
Expertise in designing and maintaining scalable data pipelines.
Strong proficiency in SQL and relational database management.
Experience with ETL tools and data integration techniques.
Ability to attend face-to-face interviews in Harrisburg.
Experience with cloud-based data platforms (e.g., AWS, Azure, GCP).
Bachelor's degree in Computer Science, Information Technology, or related field.
0 Negotiable or Not Mentioned
USA, New York
22 days ago
fusionplusinc.com
1485 Views
The Performance Engineering team is looking for a Lead AI-Driven Performance Engineer to serve as a Senior Consultant. The primary focus of this role is the development and scaling of PerfHub, an innovative AI-enabled ecosystem designed to automate end-to-end performance and resiliency validation. This platform aims to streamline the detection of application changes, automate test selection, and handle complex execution, analysis, and documentation tasks. The successful candidate will blend deep performance engineering expertise with cutting-edge AI technologies to drive efficiency across the entire organization.
Key responsibilities include utilizing Large Language Models and OpenAI tools to automate data-driven workflows and integrate testing with monitoring and analytics. You will work closely with application and infrastructure teams to define resiliency KPIs and provide high-level insights into performance trends. This position requires a strong technical background in Python and enterprise application architectures, particularly within regulated environments like financial services. You will be instrumental in transforming traditional testing practices into highly automated, scalable, and intelligent processes.
Key Requirements
8+ years of experience in Performance Engineering or related fields.
Strong foundation in performance testing, analysis, and certification.
Experience in building or contributing to automation platforms or frameworks.
Solid understanding of enterprise application architectures (web, distributed, cloud, databases).
Proficiency in Python for automation, orchestration, and data analysis.
Familiarity with performance tools and observability platforms.
Strong analytical skills and ability to collaborate across teams in a large enterprise.
Experience applying AI/ML or LLMs to engineering or testing workflows.
Familiarity with CI/CD integration and resiliency concepts.
Ability to translate technical outcomes into executive-level insights.
Experience in regulated financial services environments.
0 Negotiable or Not Mentioned
USA, New York
23 days ago
intellyk.com
1785 Views
Intellyk is seeking a Senior Java Developer for a high-impact onsite role at New York Plaza. This position involves intensive backend development using Core Java, Spring, and Hibernate to build scalable and high-quality solutions. The candidate will be an integral part of the development lifecycle, collaborating closely with Product and BA teams while participating in Agile ceremonies. This role demands a strong background in coding and implementation, specifically targeting individuals who can handle complex problem-solving and design patterns effectively.
Beyond backend expertise, the role requires approximately 20-30% exposure to frontend technologies like Angular or React. A unique aspect of this position is the integration of modern AI tools such as GitHub Copilot and Prompt Engineering into the development workflow. Candidates must be local to the New York area to facilitate a mandatory in-person final interview. This is a great opportunity for a seasoned developer to work in a fast-paced environment and contribute to innovative enterprise-level applications.
Key Requirements
Strong proficiency in Core Java development
Extensive experience with Spring Framework and Hibernate
Familiarity with UI frameworks such as Angular or React
Hands-on experience with AI tools like OpenAI and GitHub Copilot
Proven track record in coding, implementation, and software design patterns
Experience with RESTful APIs and Web Services integration
Knowledge of SQL databases, specifically DB2 or Sybase
Proficiency with Agile methodologies and DevOps environments
Experience with JUnit and various profiling tools for performance
Ability to work onsite in New York and attend in-person interviews
0 Negotiable or Not Mentioned
USA, Philadelphia
24 days ago
keypixelusa.com
1591 Views
We are seeking a highly skilled and experienced Data Architect with specific expertise in the Health Care Payer Domain to join our team in Philadelphia, PA. The ideal candidate will have over 15 years of professional experience in data architecture and a deep understanding of how to manage and structure data within the healthcare insurance sector. You will be responsible for designing, creating, deploying, and managing our organization's data architecture, ensuring that it is robust, scalable, and meets the specific needs of the payer domain.
In this hybrid role, you will leverage your expertise in various database technologies and cloud platforms such as Azure, GCP, and AWS to build modern data environments. Your work will involve collaborating with cross-functional teams to integrate disparate data sources, improve data quality, and support analytical initiatives. You must be adept at translating business requirements into technical specifications and have a proven track record of delivering high-quality data solutions in complex environments.
Key Requirements
Minimum of 15 years of professional experience in Data Architecture.
Must have deep expertise specifically within the Health Care Payer Domain.
Advanced proficiency with cloud platforms including Azure, GCP, and AWS.
Strong background in both relational and non-relational database design.
Extensive experience in building and optimizing scalable data pipelines.
Mastery of various data modeling tools and architectural methodologies.
Comprehensive knowledge of healthcare data standards and security regulations.
Proven ability to lead and deliver large-scale data transformation projects.
Excellent stakeholder management and technical communication skills.
Ability to work effectively in a hybrid office environment in Philadelphia.
0 Negotiable or Not Mentioned
USA, Raleigh
24 days ago
rgtalent.com
1313 Views
RgTalent Inc is seeking a highly skilled Senior Oracle DBA with Exadata expertise for a long-term contract role based in Raleigh, NC. This position operates on a hybrid model and requires the candidate to be local to the area. The successful candidate will be responsible for managing and maintaining complex Oracle database environments across development, test, and production stages, primarily running on Exadata and Linux within an AWS framework.
Key responsibilities include collaborating with application developers to design schemas, optimizing SQL queries, and leading modernization efforts for database scalability. The role also involves high-level technical support, implementation of disaster recovery using Oracle Data Guard, and managing replication via Oracle GoldenGate. Candidates must have extensive experience in database performance tuning and resource management for high-volume workloads to ensure system availability and efficiency.
Key Requirements
Manage and maintain Oracle database environments on Exadata and Linux platforms.
Deep expertise in managing databases within AWS cloud environments.
Collaborate with application developers to define data requirements and design tables.
Assist with complex SQL development and performance optimization.
Lead the design, implementation, and management of application schemas and database structures.
Provide technical support for Oracle databases, including diagnosis of performance issues.
Develop and maintain procedures and best practices for database operations in the cloud.
Administer high-availability and disaster recovery configurations using Oracle Data Guard.
Implement and manage replication solutions such as Oracle GoldenGate.
Perform regular database backups, restores, patching, and upgrades.
Must be local to Raleigh, NC for a hybrid onsite role.
Possess valid work authorization (USC, GC, GC EAD, H4 EAD, TN, or OPT EAD).
0 Negotiable or Not Mentioned
USA, Jersey City
24 days ago
esharpedge.com
1364 Views
Sharpedge Inc is seeking a highly skilled Senior Rancher Platform Engineer to join our team in Jersey City, NJ. In this role, you will be responsible for managing and optimizing Rancher-managed Kubernetes clusters, including RKE and RKE2 environments. You will leverage the Rancher UI, APIs, and automation workflows to ensure robust and scalable infrastructure. The ideal candidate will have extensive experience in networking and observability stacks, utilizing tools like Prometheus, Grafana, and ELK to monitor system health and performance.
Additionally, you will play a key role in designing and implementing CI/CD and GitOps workflows using Helm, Jenkins, GitHub Actions, and Argo CD. As a senior member of the team, you will contribute to the continuous improvement of our deployment strategies and container orchestration. This position requires employment on Sharpedge Inc's payroll. If you have a passion for Kubernetes and infrastructure automation, we encourage you to apply and help drive our platform's evolution.
Key Requirements
Experience with Rancher-managed Kubernetes clusters including RKE and RKE2.
Proficiency in Rancher UI, APIs, and automation workflows.
Solid understanding of networking concepts in a containerized environment.
Hands-on experience with observability stacks including Prometheus and Grafana.
Experience with centralized logging systems such as EFK or ELK stacks.
Proven track record with CI/CD and GitOps workflows using Helm and Jenkins.
Expertise in GitHub Actions and Argo CD for deployment automation.
Must be eligible to work on Sharpedge Inc payroll.
Strong knowledge of infrastructure as code (IaC) principles.
Excellent troubleshooting skills in cloud-native environments.
0 Negotiable or Not Mentioned
USA, New York City
14 days ago
univedgeconsulting.com
841 Views
As a Lead Application Architect, you will be responsible for designing and maintaining robust software application architectures that meet complex business requirements. You will take a lead role in data modeling, integration, and governance efforts to ensure data quality and consistency across the enterprise. This role requires a hybrid or on-site presence in New York City, working closely with stakeholders, developers, and IT teams to ensure the successful execution of technical projects.
Your expertise will be vital in ensuring the compatibility and integration of software components and data systems. You will monitor and enhance the performance, quality, and responsiveness of applications while developing detailed architecture models and guidelines. The ideal candidate will have strong Python full-stack solution architect skills, mandatory MongoDB expertise, and preferably experience within Google Cloud Platform environments. This position also offers the opportunity to work with climate data, adding a meaningful layer to the architectural challenges.
Key Requirements
Proven experience as a Python full-stack solution architect.
Mandatory expertise in MongoDB and database technologies.
Strong knowledge of Google Cloud Platform (GCP) or similar cloud services.
Ability to lead data modeling, integration, and governance efforts.
Proficiency in modern data processing technologies and metadata-driven modeling.
Expertise in microservices architecture and RESTful API design.
Solid understanding of front-end and back-end frameworks such as React, Angular, and Node.js.
Familiarity with CI/CD pipelines and DevOps practices.
In-depth knowledge of security best practices including IAM, encryption, and firewalls.
Experience with data orchestration, pipeline tools, and agile development methodologies.
0 Negotiable or Not Mentioned
USA, Stamford
30 days ago
aetalentsgroup.com
1825 Views
We are seeking a highly skilled Snowflake Developer to join our dynamic team for a contract duration of 6 or more months. This role is designed for a technical expert with a customer-focused mindset who can deliver excellent service to clients, partners, and stakeholders. You will be responsible for managing and resolving support tickets within SLA guidelines while working with critical integrations like Active Directory, LDAP, Outlook, Word, Excel, and Salesforce. The position requires a candidate who can handle customer calls professionally and track issue resolutions effectively to ensure high levels of client satisfaction.
In addition to development tasks, you will support software licensing and installations, perform routine server installations, and conduct necessary maintenance. Maintaining accurate documentation of all customer interactions is a key part of the role, as is the ability to troubleshoot and escalate complex technical issues to the appropriate channels. The environment is fast-paced, demanding a high attention to detail and the ability to multitask across various web-based technologies and enterprise tools. This opportunity allows for work based in Stamford or remotely, providing flexibility for the right candidate with the necessary experience and skills.
Key Requirements
Minimum of 8 years of professional experience in technical development roles.
Proven expertise as a Snowflake Developer with deep platform knowledge.
Strong troubleshooting and analytical skills to resolve complex technical issues.
Excellent verbal and written communication skills for stakeholder interaction.
Ability to multitask and maintain organization in a fast-paced environment.
High attention to detail regarding technical documentation and ticket tracking.
Customer-centric mindset with a proactive problem-solving attitude.
Familiarity with web-based technologies and standard enterprise software tools.
Hands-on experience with Active Directory and LDAP integrations.
Proficiency in Microsoft Office Suite including Word, Excel, and Outlook.
Ability to manage software licensing and perform server maintenance tasks.
Experience working with Salesforce and other CRM integrations.
0 Negotiable or Not Mentioned
USA, Virginia
15 days ago
ar-sys.com
1052 Views
AR systems is currently seeking a highly skilled Developer-Java/J2EE Specialist for a contract-based position located in Mclean, Virginia. This long-term contract is scheduled to run through October 30, 2026, with the potential for further extensions. We are specifically looking for local candidates residing within a 30-35 mile radius of Mclean, as the selection process includes an onsite interview and the role requires local presence. Applicants must be direct W-2 employees of our vendors and must hold either US Citizenship or a Permanent Resident Card.
The successful candidate will be responsible for full-cycle backend development using Java 8 and above. Key tasks include building and consuming robust RESTful API web services, working extensively with the Spring Framework and Dependency Injection, and managing build/test processes with tools like Maven, Gradle, and JUnit. Additionally, the role involves performing research and development using AI technologies, including prompt engineering and model evaluation to integrate AI capabilities into existing workflows. Candidates must possess excellent problem-solving skills and the ability to articulate technical concepts to diverse stakeholders.
Key Requirements
7–10 years of professional software development experience.
Bachelor’s degree in Computer Science or a related field (or equivalent experience).
Strong professional experience with Java backend development.
Experience building RESTful APIs using common Java frameworks like Spring or Spring Boot.
Outstanding expertise in Java 8+ including multithreading, concurrency, and collections.
Strong knowledge of Spring Framework and Dependency Injection.
Proficiency with build and test tools such as Maven, Gradle, JUnit, and Mockito.
Proven ability to perform R&D using AI, including prompt engineering and model evaluation.
Excellent problem-solving, analytical, and technical skills.
Strong oral and written communication skills to explain technical concepts.
Must be a US Citizen or hold a Permanent Resident Card (Green Card).
Must be a direct W-2 employee and reside within 30-35 miles of Mclean, VA.
0 Negotiable or Not Mentioned
USA, North Carolina
20 days ago
cynetsystems.com
1058 Views
We are seeking a highly skilled and experienced Middleware Developer with a focus on the Java ecosystem to join our technical team. This position is a remote role; however, it is strictly required that the candidate is currently local to the state of North Carolina, as relocation assistance or out-of-state hiring is not available for this specific project. The successful candidate will play a critical role in developing and managing robust middleware solutions that bridge the gap between various enterprise systems and external trading partners.
The ideal candidate must demonstrate expert-level proficiency in Java/J2EE and the Spring Boot framework to build scalable and high-performance integration services. Key responsibilities include managing B2B/EDI processes, configuring Enterprise Service Bus (ESB) solutions, and overseeing API Gateway implementations. You will be expected to design secure communication channels, troubleshoot complex integration issues, and ensure the seamless flow of data across the organization’s digital infrastructure. Collaboration with cross-functional teams is essential to align middleware strategies with broader business objectives.
Key Requirements
Strong experience with B2B and EDI integration protocols.
Proven expertise in managing Enterprise Service Bus (ESB) architectures.
Hands-on experience with API Gateway configuration and management.
Advanced proficiency in Spring Boot and the Java/J2EE ecosystem.
In-depth knowledge of Trading Partner setup and management.
Ability to design and implement secure middleware communication protocols.
Experience with XML and JSON data transformation techniques.
Strong debugging and performance tuning skills for distributed systems.
Excellent written and verbal communication skills for a remote environment.
Must be a resident of North Carolina, USA, with no relocation required.
0 Negotiable or Not Mentioned
USA, Arden
13 days ago
flexontechnologies.com
1422 Views
Flexon Technologies is currently hiring for the position of Databricks Python Engineer to join their team for a long-term project in Arden, Delaware. This is a day-one onsite role, requiring the candidate to be physically present at the location from the start of the engagement. The position is tailored for senior professionals with over 10 years of experience who can provide high-level consultancy and technical expertise. The focus of the role is within the Retail Digital domain, specifically integrating complex data solutions to improve business operations.
The technical requirements include a deep proficiency in Python, Databricks, and PL/SQL, alongside experience with ServiceNow and Aptos Store Inventory Management Systems. Candidates are expected to have a strong background in retail digital environments, as resumes without this domain expertise will not be considered. The role is offered at an hourly rate of $55/hr on a C2C basis. The successful candidate will be responsible for developing and managing data pipelines, optimizing store inventory systems, and ensuring seamless integration across various enterprise platforms.
Key Requirements
10+ years of hands-on experience as a technical consultant.
Deep expertise in Python programming for data engineering tasks.
Advanced proficiency with Databricks for processing large datasets.
Strong skills in writing and optimizing PL/SQL queries.
Prior experience working with ServiceNow platform.
Hands-on experience with Aptos Store Inventory Management System.
Essential domain experience in Retail Digital.
Ability to work onsite in Arden, DE from the first day of the contract.
Familiarity with Azure cloud infrastructure and services.
Understanding of big data ecosystems and Hadoop environments.
0 Negotiable or Not Mentioned
USA, Jersey City, NJ
11 days ago
gmail.com
490 Views
We are hiring a Java Developer to join our technical team in Jersey City, NJ. This position offers an excellent opportunity for both entry-level and experienced Java professionals to work on enterprise-level applications. We are committed to fostering an inclusive workplace and are open to candidates with F1, OPT, and CPT work authorizations.
In this role, you will be involved in the full development lifecycle of Java-based applications, from requirement gathering to deployment. You will collaborate with other developers and stakeholders to ensure the delivery of high-quality code. The ideal candidate is someone who is eager to take on technical challenges and contribute to a fast-paced development environment.
Key Requirements
In-depth knowledge of Core Java and Object-Oriented Programming
Experience with Spring Framework and Spring Boot
Understanding of Microservices architecture and RESTful APIs
Proficiency in relational databases such as MySQL or PostgreSQL
Experience with build tools like Maven or Gradle
Knowledge of unit testing frameworks like JUnit or TestNG
Familiarity with Agile development methodologies
Strong debugging and performance tuning skills
Bachelor's degree in Computer Science or equivalent
Good interpersonal and communication skills
0 Negotiable or Not Mentioned
USA, Malvern
28 days ago
judge.com
1310 Views
We are seeking a highly skilled AI/ML Data Scientist to join our team for a long-term engagement in Malvern, Pennsylvania. This is a hybrid role that requires the candidate to be onsite from day one to collaborate effectively with the local team. The successful applicant will be responsible for designing and implementing advanced machine learning algorithms and artificial intelligence models to extract valuable insights from complex datasets. You will work closely with cross-functional teams to identify business opportunities where AI and ML can drive significant impact and efficiency.
This contract position has an expected duration of over one year, providing a stable opportunity to contribute to high-impact projects. The selection process will include video interviews for shortlisted candidates. Applicants are required to submit a comprehensive resume along with a copy of their work authorization and a link to their professional LinkedIn profile. This role is ideal for individuals who thrive in a data-driven environment and possess a strong technical foundation in modern data science practices.
Key Requirements
Proficiency in Python, R, or similar programming languages for data analysis.
Proven experience in developing and deploying Machine Learning and AI models.
Ability to work onsite in Malvern, PA on a hybrid schedule from the first day.
Strong understanding of statistical concepts and probability theory.
Experience with data visualization tools such as Tableau, PowerBI, or Matplotlib.
Ability to clean, preprocess, and manage large-scale datasets efficiently.
A valid work authorization copy must be provided with the application.
Provision of a professional LinkedIn profile URL for background verification.
Excellent communication skills to translate technical findings to stakeholders.
Master’s or PhD degree in Computer Science, Data Science, or a related quantitative field.
0 Negotiable or Not Mentioned
USA, Harrisburg, PA
28 days ago
zeforge.com
2007 Views
This is a long-term position for a Senior .NET Application Developer (Technical Architect) based in Harrisburg, PA, supporting the State of Pennsylvania. The role requires local Harrisburg, PA profiles as candidates should reside within driving distance of the office to report on-site if or when needed. Applicants are required to attach a copy of their Driver's License and their LinkedIn profile link when submitting their resumes for consideration. The developer will participate in the full software development lifecycle, including requirements analysis, design, development, testing, and deployment within an enterprise environment.
The technical environment for this role is robust, involving .NET/C#, Angular, TypeScript, and Entity Framework Core. The candidate will work extensively with REST APIs, Microsoft Azure, and Azure DevOps for CI/CD pipelines. Database management will involve Oracle, Azure SQL, and PostgreSQL databases, while integration tasks will include working with SAP BusinessObjects and enterprise data warehouse platforms. Proficiency in automation using Python or PowerShell is highly desired for this technical leadership role.
Key Requirements
10 or more years of professional IT experience in enterprise application development.
7 or more years of experience developing applications using .NET technologies such as C#, ASP.NET, and Web APIs.
Strong experience developing modern web applications using Angular or similar frameworks.
Extensive experience designing and optimizing solutions using relational databases such as Oracle, SQL Server, or Azure SQL.
Proven experience developing applications within Microsoft Azure environments.
Experience implementing source control, automated builds, and CI/CD pipelines using Azure DevOps.
Hands-on experience participating in the full software development lifecycle (SDLC).
Experience developing accessible web applications in compliance with WCAG guidelines.
Strong written and verbal communication skills for technical leadership and collaboration.
Experience developing automation or scripting solutions using Python or PowerShell.
Ability to integrate applications with business intelligence platforms such as SAP BusinessObjects.
Experience mentoring developers and providing technical leadership within software development teams.
0 Negotiable or Not Mentioned
USA, Mclean
15 days ago
ar-sys.com
1011 Views
AR systems is seeking a highly skilled Developer-Java/J2EE Specialist for a long-term contract role based in Mclean, VA. The successful candidate will focus on backend development, building robust RESTful APIs, and utilizing modern Java frameworks like Spring and Spring Boot. This position requires a professional who is proficient in Java 8+ and can handle complex multithreading and concurrency tasks within a fast-paced environment.
Beyond core development, the role involves performing research and development using AI technologies. This includes evaluating AI-assisted approaches, rapid prototyping, and integrating AI capabilities into existing workflows. Candidates must be local to the Mclean area, as an onsite interview is required. This contract assignment is scheduled to end on October 30, 2026, with a possibility for an extension.
Key Requirements
7–10 years of professional software development experience.
Bachelor’s degree in Computer Science or a related field (or equivalent experience).
Strong professional experience with Java backend development.
Experience building RESTful APIs and working with Spring or Spring Boot.
Expertise in Java 8+ including I/O, multithreading, and generics.
Proficiency with build and test tools such as Maven, Gradle, JUnit, and Mockito.
Proven ability to perform R&D using AI, including prompt engineering and prototyping.
Must be either a US Citizen or hold a Permanent Resident Card.
Excellent problem-solving, analytical, and technical communication skills.
Must be local to Mclean, VA, living within a 30-35 mile radius.
0 Negotiable or Not Mentioned
USA, New York
1 day ago
cloudrover.io
97 Views
Cloud Rover is seeking a Senior SharePoint Migration Specialist for a key project based in New York. This role involves planning and executing the migration of content from legacy SharePoint environments to SharePoint Online and Microsoft 365. You will be responsible for assessing existing structures, developing migration strategies, and ensuring data integrity throughout the transition process. The ideal candidate will have extensive experience with migration tools and a deep understanding of Microsoft's collaborative ecosystem. Applicants must have USC, GC, GC EAD, H4 EAD, or OPT EAD status.
In addition to the migration tasks, you will provide technical support for post-migration issues and help configure SharePoint sites to meet business needs. This includes managing permissions, setting up metadata structures, and automating workflows using Power Automate. You will collaborate with various departments to ensure a seamless user experience and provide training where necessary. If you are a SharePoint expert with a track record of successful migrations, we encourage you to apply with your resume, LinkedIn link, and current location.
Key Requirements
Proven experience in migrating legacy SharePoint versions to SharePoint Online.
Expertise in migration tools such as ShareGate, AvePoint, or Metalogix.
Strong knowledge of SharePoint Server (2013/2016/2019) and M365.
Proficiency in PowerShell scripting for SharePoint management and automation.
Understanding of SharePoint architecture, site collections, and taxonomies.
Experience with Microsoft Power Platform, including Power Automate and Power Apps.
Ability to troubleshoot complex migration errors and data discrepancies.
Knowledge of O365 security, compliance, and governance policies.
Strong documentation skills for migration planning and reporting.
Ability to work independently and manage project timelines effectively.
Excellent analytical and communication skills.
Valid US work authorization (USC, GC, GC EAD, H4 EAD, or OPT EAD).
0 Negotiable or Not Mentioned
USA, Whippany
27 days ago
gvrinfotek.com
1729 Views
GVR Infotek is seeking a highly skilled and experienced Senior Java Developer to join our team in a hybrid capacity based in Whippany, NJ. This senior-level role requires a professional with at least 12 years of experience in backend development, specifically focusing on building scalable and high-performance systems. The successful candidate will be responsible for designing and implementing microservices architectures and leveraging the latest Java versions to meet complex business requirements.
The position involves working extensively with Apache Kafka and other streaming platforms to manage data flow and system communication. You will be expected to demonstrate expertise in Spring Boot, multithreading, and SQL performance tuning. Additionally, you will play a key role in optimizing system concurrency and managing relational databases. This is an excellent opportunity for a seasoned developer to contribute to distributed systems and work with modern containerization technologies in a dynamic environment.
Key Requirements
Minimum of 12 years of experience in Java development.
Expertise in Java 11+ with strong backend development skills.
Extensive hands-on experience with Spring Boot and Spring Framework.
Solid understanding of REST APIs and Microservices Architecture.
Proven experience with Apache Kafka or similar streaming platforms.
Advanced knowledge of Multithreading and Concurrency management.
Proficiency in SQL and relational databases like PostgreSQL or SQL Server.
Ability to perform system performance tuning and optimization.
Experience working with Docker and Kubernetes for container orchestration.
Familiarity with Redis and high-performance distributed systems.
0 Negotiable or Not Mentioned
USA, Reston
23 days ago
hexaware.com
2272 Views
Hexaware is seeking an experienced Senior Salesforce Developer to join our team in Reston, VA. This role involves working on cutting-edge projects where you will be responsible for designing and implementing scalable technical solutions. You will collaborate with cross-functional teams using Agile/Scrum methodologies to deliver high-quality Salesforce applications and integrations.
The ideal candidate will have over a decade of hands-on experience in the Salesforce ecosystem. You will lead development efforts using Apex, Lightning Web Components, and other advanced tools. You will also manage CI/CD pipelines and ensure the security and architectural integrity of the platform. If you are passionate about Salesforce and ready to elevate your career, we encourage you to apply.
Key Requirements
10+ years of hands-on Salesforce development experience
Proficiency in Apex, Visualforce, Lightning Web Components (LWC), and SOQL/SOSL
Experience with Salesforce APIs (REST/SOAP), OAuth, and integration patterns
Strong understanding of Salesforce architecture, including governor limits and security
Familiarity with CI/CD tools (e.g., Git, Jenkins, Salesforce DX)
Experience with Agile/Scrum methodologies
Ability to translate business requirements into scalable technical solutions
Salesforce Platform Developer I & II certifications
Strong problem-solving and analytical thinking
Excellent communication and collaboration skills