Best Talent Reach (BTR) Senior Cybersecurity Engineer at Alphosoft

SENIOR CYBERSECURITY ENGINEER @ ALPHOSOFT

Salary: Negotiable or not mentioned · Chantilly, USA · Posted 2 hours ago · Source: alphosoft.com

We are seeking a Senior Cybersecurity Engineer to join our security operations team in Chantilly, VA. This critical role requires at least 8 years of experience in information security, with a heavy focus on Application Security (AppSec) and cloud defense. You will be responsible for identifying vulnerabilities, implementing security controls, and ensuring that our applications and infrastructure are protected against evolving cyber threats.

The candidate will lead SAST and DAST scanning efforts and manage security across AWS, Azure, and GCP environments. Proficiency with security tools such as CrowdStrike, CyberArk, and Splunk is required for incident detection and management. You will also oversee Identity and Access Management (IAM) and ensure strict adherence to NIST and PCI-DSS compliance standards. This position is based in Chantilly, VA, and candidates willing to relocate are welcome to apply.

Key Requirements

- 8+ years of professional experience in cybersecurity engineering.
- Expertise in Application Security, including SAST and DAST methodologies.
- Proven experience securing multi-cloud environments (AWS, Azure, GCP).
- Hands-on experience with CrowdStrike and CyberArk for endpoint security.
- Proficiency in using Splunk for security information and event management (SIEM).
- Experience with security scanning tools like Fortify and Checkmarx.
- Strong knowledge of Identity and Access Management (IAM) protocols.
- Deep understanding of NIST, PCI-DSS, and other compliance frameworks.
- Experience in vulnerability management and incident response procedures.
- Ability to perform complex security audits and risk assessments.

SENIOR DATA ENGINEER @ MANPOWERGROUP

Salary: ~$11,666/month · New York, United States · Posted 7 days ago · Source: gmail.com

We are actively seeking a highly skilled Senior Data Engineer to build and scale modern data infrastructure for a fast-growing organization within the Financial Services and Data & Analytics industry. In this role, you will play a critical part in designing, developing, and optimizing data pipelines and architectures that support advanced analytics and critical business intelligence initiatives. You will be responsible for ensuring the scalability and performance of data systems while maintaining the highest standards of data quality and governance.

The ideal candidate will have extensive experience in building scalable ETL/ELT pipelines and maintaining robust data warehouses and data lakes. You will work with large-scale structured and unstructured datasets, collaborating closely with data scientists and analysts to provide the foundational data structures needed for complex modeling. The position offers a competitive package ranging from $140,000 to $200,000 annually, plus bonuses and full benefits, based in New York.

Key Requirements

- 5+ years of professional experience in data engineering roles.
- Strong proficiency in programming languages, particularly Python.
- Advanced knowledge of SQL for complex data manipulation and querying.
- Hands-on experience with Apache Spark for large-scale data processing.
- Extensive experience with cloud platforms such as AWS, Azure, or GCP.
- Proven track record with data warehousing solutions and architecture.
- Strong understanding of big data technologies and distributed systems.
- Ability to design and build scalable ETL and ELT pipelines.
- Proficiency in maintaining and optimizing data lakes for performance.
- Excellent collaboration skills for working with data scientists and analysts.
- Experience in ensuring data quality, integrity, and corporate governance.

SENIOR DATA MODELER / DATA ARCHITECT @ ALPHOSOFT

Salary: Negotiable or not mentioned · Columbus, USA · Posted 2 hours ago · Source: alphosoft.com

Alphosoft is hiring a Senior Data Modeler and Data Architect to lead our data strategy in Columbus, OH. This role requires more than 13 years of experience and focuses on designing the conceptual, logical, and physical data models that support our large-scale enterprise systems. You will work closely with stakeholders to understand business requirements and translate them into scalable and efficient data architectures.

The role involves utilizing Snowflake and AWS for cloud data warehousing and leveraging tools like DBT and Databricks for transformation. You will use Erwin for advanced data modeling and Informatica for ETL processes, while applying Data Vault 2.0 methodologies to ensure data integrity. Knowledge of data governance tools like Collibra and reporting through Power BI is highly valued. The position is based in Columbus, OH, and we welcome applicants who are open to relocating.

Key Requirements

- Minimum of 13 years of experience in data modeling and architecture.
- Expertise in cloud data warehousing, specifically with Snowflake and AWS.
- Proficiency with DBT and Databricks for data transformation.
- Advanced skills in using Erwin for enterprise data modeling.
- Strong experience with Informatica for ETL development.
- Deep understanding of Data Vault 2.0 architecture and methodologies.
- Experience with data governance and metadata management tools like Collibra.
- Proficiency in Power BI for developing advanced data visualizations.
- Familiarity with industry-specific platforms such as Guidewire.
- Excellent analytical skills to design complex enterprise-level data schemas.

SENIOR JAVA DEVELOPER @ ALPHOSOFT

Salary: Negotiable or not mentioned · Jersey City, USA · Posted 2 hours ago · Source: alphosoft.com

We are looking for a Senior Java Developer with Full Stack capabilities to join our development team in Jersey City, NJ. This role requires a seasoned developer with at least 10 years of experience in building enterprise-grade applications. You will be responsible for the full software development lifecycle, from designing front-end user interfaces to developing robust back-end microservices and managing database integrations.

The ideal candidate will have a strong command of Java and the Spring Boot framework, paired with advanced skills in modern front-end libraries like Angular and React. You will be tasked with deploying applications in containerized environments using Docker and Kubernetes within the AWS cloud ecosystem. Experience with high-performance messaging systems like Kafka and database management in Oracle and MongoDB is essential. This role is located in Jersey City, NJ, and relocation assistance is available for qualified candidates.

Key Requirements

- At least 10 years of experience in Java and Full Stack development.
- Expertise in developing microservices using the Spring Boot framework.
- Proficiency in front-end development with Angular and React.
- Extensive experience with AWS cloud services and infrastructure.
- Hands-on experience with containerization using Docker and Kubernetes.
- Deep understanding of Kafka for event streaming and messaging.
- Proficiency in managing relational and NoSQL databases like Oracle and MongoDB.
- Strong skills in designing and consuming RESTful APIs.
- Experience with CI/CD pipelines and DevOps best practices.
- Ability to lead technical design discussions and mentor junior developers.

SENIOR DATA ENGINEER @ ALPHOSOFT

Salary: Negotiable or not mentioned · Columbus, USA · Posted 2 hours ago · Source: alphosoft.com

Alphosoft is currently seeking a highly experienced Senior Data Engineer to join our technical team in Columbus, OH. This role is designed for a professional with over a decade of hands-on experience in building and optimizing large-scale data systems. The successful candidate will be responsible for designing, constructing, and maintaining high-performance data pipelines that enable the business to leverage complex datasets for strategic decision-making.

The position requires extensive expertise in the modern data stack, specifically using PySpark, Databricks, and Snowflake to process and store data efficiently. You will work with cloud-native technologies including AWS Glue, Azure Data Factory, and Airflow for orchestration. The role involves implementing robust ETL processes and managing real-time data streaming through Kafka. This position is based in Columbus, OH, and we are open to candidates who are willing to relocate to the area.

Key Requirements

- Minimum of 10 years of professional experience in Data Engineering.
- Deep proficiency in PySpark and Databricks for data processing.
- Hands-on experience with Delta Lake and Snowflake data warehousing.
- Expertise in cloud services including AWS Glue and Azure Data Factory.
- Proven experience with workflow orchestration tools such as Airflow.
- Strong knowledge of real-time data streaming technologies like Kafka.
- Proficiency in Infrastructure as Code using Terraform.
- Experience with data visualization tools like Power BI.
- Strong SQL skills and experience with relational database management.
- Ability to design and maintain scalable and reliable data architectures.

DATA ENGINEER @ CONVEX TECH INC.

Salary: Negotiable or not mentioned · New York, USA · Posted 4 days ago · Source: convextech.com

Convex Tech Inc. is seeking a skilled Data Engineer for a hybrid role based in New York. This position requires the candidate to work onsite three days a week and participate in an onsite interview process. The successful candidate will focus on designing and implementing scalable data pipelines within the Azure ecosystem, specifically utilizing Azure Databricks and Azure Data Factory. The role involves developing robust ETL/ELT workflows using Apache Spark and PySpark DataFrames to process large datasets efficiently while ensuring optimal performance and scalability.

Beyond core pipeline development, the Data Engineer will be responsible for maintaining data governance, security, and compliance. Key tasks include implementing data quality frameworks, managing data lineage, and supporting modern Lakehouse architectures. Candidates must possess a deep understanding of SQL-based transformations and Master Data Management (MDM) concepts to ensure data consistency and integrity across the organization. This is a contract-based opportunity for 6 months or more, specifically looking for USC or GC holders ready to work in a hybrid environment.

Key Requirements

- Design and implement scalable data pipelines using Azure Databricks and Azure Data Factory.
- Develop and maintain robust ETL/ELT workflows using Apache Spark and PySpark DataFrames.
- Build and optimize data pipelines for efficient data ingestion and processing of large datasets.
- Utilize data governance tools to manage data access, security, compliance, and data lifecycle.
- Implement data quality frameworks and maintain data lineage across enterprise data platforms.
- Design and support modern data architecture using Lakehouse and distributed data processing.
- Develop high-performance Spark and SQL-based data transformation procedures.
- Apply Master Data Management (MDM) concepts to ensure data consistency and standardization.
- Must be a US Citizen or Green Card holder (USC/GC only).
- Willingness to work onsite in New York 3 days a week and attend an onsite interview.

SCRUM MASTER @ ANETCORP

Salary: Negotiable or not mentioned · New York, USA · Posted 1 day ago · Source: anetcorp.com

The company is seeking an experienced Scrum Master to facilitate and lead Agile delivery for Adobe Experience Manager (AEM) projects in a hybrid work environment based in New York. The successful candidate will be responsible for guiding AEM-focused Agile teams, ensuring smooth delivery cycles, and maintaining high levels of collaboration between technical and business stakeholders.

In this role, you will manage scrum ceremonies, track team velocity, and report key metrics to ensure project transparency. You will work closely with cross-functional teams including UX, QA, DevOps, and Content specialists to remove blockers and drive continuous improvement. Strong communication and conflict resolution skills are essential for managing client expectations and fostering a high-performing team culture.

Key Requirements

- Certified Scrum Master (CSM) or equivalent certification
- 3+ years of experience as a Scrum Master
- At least 1 year of experience working specifically on AEM projects
- Strong knowledge of Adobe Experience Manager (AEM) architecture
- Deep understanding of AEM components and workflows
- Expertise in facilitating Agile/Scrum ceremonies and coaching teams
- Hands-on experience with Jira, Confluence, Rally, or Azure DevOps
- Proven ability to manage and refine product backlogs
- Experience tracking team velocity and reporting delivery metrics
- Excellent communication, facilitation, and conflict resolution skills
- Experience working with cross-functional teams, including UX and QA

UNIX/LINUX MIGRATION CONSULTANT @ RAVINI IT SOLUTIONS

Salary: Negotiable or not mentioned · New York, USA · Posted 4 days ago · Source: ravinitsolutions.com

We are seeking a dedicated and experienced UNIX/Linux Migration Consultant to join our team for large-scale infrastructure and data center projects. This role involves leading and supporting critical server migration initiatives within complex enterprise environments. The successful candidate will work on a long-term contract basis, focusing on the seamless transition of file systems, binaries, cron jobs, and middleware while ensuring minimal disruption to business operations.

The ideal candidate will be responsible for developing comprehensive cutover strategies, managing the full migration lifecycle, and designing robust High Availability and Disaster Recovery solutions. You will lead migration planning, mitigate risks, and coordinate with stakeholders to ensure project success. Additionally, you will provide essential post-migration Hypercare support and performance tuning across AIX, Linux, and Solaris environments to maintain optimal system health and reliability.

Key Requirements

- 4+ years of professional UNIX/Linux migration experience.
- Strong technical expertise in AIX, Linux, and Solaris operating systems.
- Proven experience with High Availability (HA) and Disaster Recovery (DR) infrastructure design.
- Strong background in large-scale data center migration projects and lifecycle management.
- Expertise in system troubleshooting, performance tuning, and dependency mapping.
- Ability to plan and execute migrations of file systems, binaries, and cron jobs.
- Experience developing detailed cutover strategies and risk mitigation plans.
- Strong stakeholder coordination and communication skills for lead roles.
- Ability to provide post-migration Hypercare and technical support.
- Familiarity with middleware configuration and enterprise server environments.