Best Talent Reach (BTR) 10 Jobs Found for "spark"



SYSTEMS ADMINISTRATOR – DATABRICKS PLATFORM @ VERAZ INC

Salary: Negotiable or Not Mentioned · Austin, TX, USA · Posted 7 days ago · verazinc.com · 203 views

The company is seeking a highly experienced Systems Administrator specializing in the Databricks Platform for a hybrid role located in Austin, Texas. This position requires a professional with over ten years of experience, ideally one who has previously worked with state clients. The successful candidate will be responsible for managing Databricks workspaces and ensuring optimal platform performance through effective cluster management and job scheduling. Key responsibilities include implementing robust security measures using IAM, SCIM, and RBAC, as well as managing cloud integrations such as S3. Applicants should possess deep knowledge of Spark and Databricks SQL for governance and monitoring purposes. Proficiency in automation tools such as Terraform and CI/CD pipelines is also essential for maintaining and scaling the environment efficiently.

Key Requirements

- 10+ years of professional experience in systems administration.
- Prior experience working with state clients is highly preferred.
- Strong hands-on experience with Databricks workspace administration.
- Proven ability in cluster management and performance tuning.
- Expertise in Identity and Access Management (IAM) and SCIM.
- Proficient in Role-Based Access Control (RBAC) implementation.
- Experience with cloud integrations, specifically Amazon S3.
- Solid knowledge of Apache Spark and Databricks SQL.
- Familiarity with security, governance, and monitoring best practices.
- Proficiency in automation tools including Terraform and CI/CD workflows.

JAVA, SCALA, AND C++ ENGINEER WITH NOSQL EXPERTISE @ TANISHA SYSTEMS

Salary: Negotiable or Not Mentioned · Austin, TX, USA · Posted 8 days ago · tanishasystems.com · 375 views

Tanisha Systems is currently seeking a highly skilled Software Engineer with expertise in Java, Scala, and C++ to join our team on-site in Austin, Texas. This is a contract position that requires a seasoned professional with at least 8 years of industry experience building scalable microservices or data-driven platforms. The ideal candidate will possess a strong foundation in computer science and have a proven track record of working on sophisticated problems and systems within a fast-paced environment.

The role involves working closely with NoSQL datastores such as Cassandra or DynamoDB and utilizing advanced data processing technologies including Kafka, Spark, and Flink. Candidates should be well-versed in cloud paradigms, specifically AWS or GCP, to craft and maintain robust, scalable systems. You will be responsible for end-to-end development, from architectural design to debugging and problem-solving, ensuring the reliability and performance of our data-driven infrastructure.

Key Requirements

- Minimum of 8 years of industry experience in software engineering.
- Demonstrated experience building scalable microservices or data-driven platforms.
- Strong fundamental knowledge of computer science and software architecture.
- Hands-on proficiency in Java, Scala, and C++ programming languages.
- Deep expertise in NoSQL datastores such as Cassandra or DynamoDB.
- Proficiency in data processing technologies including Kafka, Spark, and Flink.
- Solid familiarity with cloud paradigms, specifically AWS or GCP environments.
- Proven ability to analyze and solve sophisticated technical problems.
- Excellent debugging and analytical skills for large-scale systems.
- Experience working in an onsite, collaborative development environment.
- Ability to manage contract-based deliverables within project timelines.
- Knowledge of distributed systems and high-availability design.

DATABRICKS ADMINISTRATOR (SYSTEMS ANALYST 3) @ AE TALENTS GROUP

Salary: Negotiable or Not Mentioned · Austin, TX, USA · Posted 10 days ago · aetalentsgroup.com · 411 views

We are seeking an experienced Databricks Administrator to manage and support enterprise data, analytics, and AI/ML workloads in a cloud environment. The role focuses on the administration and configuration of Databricks workspaces, including the oversight of clusters, jobs, and platform governance to ensure high performance and reliability across the infrastructure. This is a hybrid position specifically for local candidates in the Austin, Texas area.

In addition to technical configuration, the candidate will manage user access through security controls, monitor platform health, and implement cost optimization strategies. You will play a critical role in supporting data engineers, analysts, and scientists while maintaining strict adherence to enterprise security standards. Key responsibilities include managing IAM/SCIM/RBAC access and utilizing DevOps automation tools such as Terraform to streamline environment management.

Key Requirements

- Minimum of 8 years of professional experience in data or systems analysis.
- Extensive expertise in Databricks Administration specifically within an AWS environment.
- Demonstrated experience in Cluster Configuration and automated Job Scheduling.
- Strong understanding of Access Management using IAM, SCIM, and RBAC.
- Advanced proficiency in Apache Spark performance tuning and troubleshooting.
- Hands-on experience with Cloud Storage integration, particularly Amazon S3.
- Skilled in the use of Databricks SQL, Notebooks, and Job Orchestration tools.
- Proven ability to conduct Platform Monitoring and Performance Optimization.
- Solid knowledge of Data Security, Encryption, and enterprise Compliance standards.
- Experience with Infrastructure as Code and Automation using Terraform and CI/CD.
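The IAM/SCIM/RBAC access management this listing emphasizes boils down to mapping users to roles and roles to permissions, then checking the mapping at request time. A minimal sketch of the RBAC idea in plain Python; all role, user, and permission names here are hypothetical illustrations, not Databricks API entities:

```python
# Minimal role-based access control (RBAC) sketch.
# Roles, users, and permissions are hypothetical examples.
ROLE_PERMISSIONS = {
    "workspace_admin": {"manage_clusters", "schedule_jobs", "grant_access"},
    "data_engineer":   {"schedule_jobs", "read_tables", "write_tables"},
    "analyst":         {"read_tables"},
}

USER_ROLES = {
    "alice": {"workspace_admin"},
    "bob":   {"data_engineer", "analyst"},
}

def is_allowed(user: str, permission: str) -> bool:
    """A user is allowed if any of their roles grants the permission."""
    return any(
        permission in ROLE_PERMISSIONS.get(role, set())
        for role in USER_ROLES.get(user, set())
    )
```

In a real Databricks deployment, the user-to-role mapping would typically be provisioned from the identity provider via SCIM rather than maintained by hand.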

SENIOR DATA ENGINEER @ MOMENTOUSA

Salary: Negotiable or Not Mentioned · McLean, VA, USA · Posted 14 days ago · momentousa.com · 1326 views

Momentousa is hiring a Senior Data Engineer for an onsite W2 position located in McLean, VA. This role is ideal for a seasoned professional with deep expertise in big data technologies and data warehousing. You will be responsible for designing, building, and maintaining scalable data pipelines that process vast amounts of information to support our analytics and business intelligence initiatives.

The successful candidate will work extensively with AWS services, Spark, and Pyspark to transform raw data into actionable insights. You will leverage your advanced SQL skills and Python knowledge to model data and optimize database performance. This role demands a high level of technical proficiency in Hive and general data modeling principles to ensure our data architecture is robust, efficient, and capable of supporting complex business queries.

Key Requirements

- Significant experience with AWS cloud data services.
- Expert-level knowledge of Spark and Pyspark for data processing.
- Advanced proficiency in SQL, including basic and complex query optimization.
- Strong backend development skills using Python.
- Practical experience with Hive for data warehousing and querying.
- Proven ability in Data Modelling and architecture design.
- Experience building and maintaining robust ETL pipelines.
- Knowledge of performance tuning for big data applications.
- Ability to work onsite in McLean, VA on a regular basis.
- Strong analytical skills to interpret complex data sets.
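The pipeline work this posting describes follows the usual extract-transform-aggregate shape. A stdlib-only sketch of that shape; in practice the same logic would be expressed in PySpark against data read from S3 or Hive, and the record fields here are illustrative assumptions:

```python
from collections import defaultdict

# Toy stand-in for raw pipeline input; a real job would read this
# from S3, Hive, or another upstream source.
raw_events = [
    {"user": "u1", "amount": "10.5", "region": "east"},
    {"user": "u2", "amount": "3.0",  "region": "west"},
    {"user": "u1", "amount": "bad",  "region": "east"},  # malformed row
    {"user": "u3", "amount": "7.5",  "region": "east"},
]

def transform(events):
    """Drop malformed rows and cast amounts to float."""
    for e in events:
        try:
            yield {**e, "amount": float(e["amount"])}
        except ValueError:
            continue  # a real pipeline would route this to a dead-letter sink

def total_by_region(events):
    """Aggregate cleaned amounts per region (GROUP BY region in SQL terms)."""
    totals = defaultdict(float)
    for e in events:
        totals[e["region"]] += e["amount"]
    return dict(totals)
```

The same transform/aggregate pair maps directly onto a PySpark `filter`/`withColumn` followed by `groupBy(...).sum(...)`.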

DATA ENGINEER @ VIVID TECHNOLOGIES

Salary: Negotiable or Not Mentioned · USA · Posted 15 days ago · vivid-technologies.com · 867 views

Join Vivid Technologies as a Data Engineer where we bridge the gap between talented professionals and top-tier W2 contracting opportunities in the USA. We focus on marketing candidates with strong technical backgrounds, offering a robust support system that includes resume polishing and technical interview coaching. Our goal is to place you in high-impact projects quickly while providing the tools needed for long-term career sustainability.

Candidates will benefit from a hybrid work structure, allowing for flexibility while engaging with projects in various regions. We offer continuous guidance from the initial marketing phase through the entire project lifecycle. This role is ideal for those who thrive in data-centric environments and are looking for a reliable partner to manage their professional marketing and placement in the United States.

Key Requirements

- Must hold GC, GC-EAD, H4-EAD, or USC work authorization.
- Strong proficiency in Python, Scala, or Java programming languages.
- Advanced knowledge of SQL and database management systems.
- Experience with ETL tools and designing efficient data pipelines.
- Familiarity with Big Data technologies such as Hadoop or Spark.
- Hands-on experience with cloud platforms like AWS, Azure, or GCP.
- Understanding of data modeling and data warehousing concepts.
- Ability to troubleshoot complex data issues and optimize performance.
- Capability to work effectively in a hybrid team setting.
- Proactive approach to learning new technologies and project requirements.

DATA ENGINEER WITH AWS ICEBERG @ NEXTGEN IT INC

Salary: Negotiable or Not Mentioned · Chicago, IL, USA · Posted 15 days ago · nextgenitinc.net · 710 views

NextGen IT Inc is seeking a highly experienced Data Engineer specializing in AWS Iceberg to join our team in Chicago, Illinois. This is a contract role (C2C) that requires a seasoned professional with over 10 years of experience in the field. The successful candidate will be responsible for building and maintaining robust data pipelines, utilizing AWS Iceberg for optimized data storage and management. You will work closely with cross-functional teams to deliver scalable data solutions that meet the complex needs of our clients.

Please note that this position requires a mandatory Face-to-Face (F2F) interview process in Chicago. Applicants must demonstrate deep technical knowledge of the AWS ecosystem, big data frameworks, and cloud-native data warehousing. As a senior-level consultant, you will be expected to lead data architecture discussions and ensure the highest standards of data integrity and performance across the organization.

Key Requirements

- Over 10 years of professional experience in data engineering roles.
- Strong proficiency in AWS cloud services and specific expertise with AWS Iceberg.
- Ability to attend a mandatory Face-to-Face (F2F) interview in Chicago, Illinois.
- Advanced knowledge of ETL pipelines and data processing frameworks.
- Hands-on experience with SQL for complex data manipulation and querying.
- Proficiency in programming languages such as Python or Scala for data engineering tasks.
- Experience with big data technologies like Apache Spark or Hadoop.
- Understanding of data modeling, data warehousing, and lakehouse architectures.
- Excellent communication skills for collaborative project environments.
- Proven track record of managing large-scale datasets and ensuring data quality.
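"Ensuring data quality" in a lakehouse context usually starts with checks like deduplication on a business key and null validation before data is committed to a table. A stdlib-only sketch of those two checks; the `id` and `name` column names are illustrative assumptions:

```python
def deduplicate(rows, key):
    """Keep the first row seen for each value of a business key."""
    seen, out = set(), []
    for row in rows:
        if row[key] not in seen:
            seen.add(row[key])
            out.append(row)
    return out

def null_report(rows, required):
    """Count missing (None or empty) values per required column."""
    return {
        col: sum(1 for row in rows if row.get(col) in (None, ""))
        for col in required
    }

records = [
    {"id": 1, "name": "alpha"},
    {"id": 1, "name": "alpha-dup"},  # duplicate business key
    {"id": 2, "name": None},         # missing required field
]
clean = deduplicate(records, "id")
report = null_report(clean, ["name"])
```

At Iceberg scale the same checks would run as Spark jobs, with duplicates typically resolved via `MERGE INTO` against the table's key columns rather than in-memory sets.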

DATA ENGINEER @ AMA GLOBAL TECH

Salary: Negotiable or Not Mentioned · New York City, NY, USA · Posted 24 days ago · amaglobaltech.com · 1709 views

Ama Global Tech is seeking a skilled Data Engineer for a hybrid role located in New York City, NY. This position requires a professional who can design, build, and maintain scalable data pipelines and architectures. You will work closely with cross-functional teams to ensure data accessibility and quality, focusing on high-performance computing and cloud-based environments. The role involves a mix of remote work and onsite presence, specifically requiring local candidates capable of attending face-to-face interviews.

The ideal candidate will demonstrate mastery over the AWS ecosystem and the Databricks platform. You will be responsible for implementing data processing solutions using Spark and Python, while managing containerized applications with Docker and Kubernetes. We are looking for a proactive problem-solver who can navigate the complexities of data warehousing and data lakes to provide actionable insights for the business. A certification in Databricks Engineering is a significant plus for this position.

Key Requirements

- Strong experience with AWS services including S3, Lambda, and EMR.
- Proficiency in Spark and Python for complex data engineering tasks.
- Solid understanding of data warehousing and data lake (DW/DH) concepts.
- Hands-on experience with Docker and Kubernetes for containerized environments.
- Certified Databricks Engineer is highly preferred.
- Excellent troubleshooting and debugging skills to resolve technical issues.
- Ability to attend a mandatory Face-to-Face (F2F) interview in New York City.
- Must be a local candidate currently residing in or near New York City.
- Eligible for C2C with H1 or W2 with GC/USC status.
- Strong communication skills for effective team collaboration.

SUPER SENIOR DATA ENGINEER (1 POSITION) @ PNC

Salary: Negotiable or Not Mentioned · Pittsburgh, PA, USA · Posted 24 days ago · skilzmatrix.com · 2063 views

PNC is currently seeking a highly experienced Super Senior Data Engineer with over 10 years of professional experience to join their team in Pittsburgh, PA. The successful candidate will play a critical role in designing, building, and maintaining scalable data pipelines leveraging the full suite of AWS cloud services. This position involves developing and optimizing sophisticated ETL and ELT workflows to handle both structured and semi-structured data, ensuring that high-performance analytics are available for business decision-making. Working within an agile environment, the role demands an expert-level understanding of data processing jobs using Python and PySpark.

In addition to pipeline construction, the engineer will be responsible for integrating and managing data within the Snowflake cloud data warehouse. This includes writing complex SQL queries for data transformation and validation, as well as supporting Power BI dashboards by delivering curated, analytics-ready datasets. Candidates must demonstrate a strong commitment to data quality, governance, performance, and security best practices. This role is offered on a W2 basis and is ideal for individuals with prior experience in the financial services or banking domain who are looking to apply their technical leadership in a dynamic corporate environment.

Key Requirements

- Minimum of 10 years of professional experience in Data Engineering or a related field.
- Advanced proficiency in SQL, including complex querying and performance tuning.
- Extensive experience designing and maintaining scalable data pipelines on AWS.
- Expert knowledge of Python and PySpark for large-scale data processing.
- Hands-on experience with Snowflake cloud data warehouse management and integration.
- Proven ability to develop and optimize ETL/ELT workflows for various data formats.
- Experience supporting Power BI through data modeling and performance optimization.
- Familiarity with AWS services such as S3, Glue, EMR, Lambda, and Redshift.
- Strong understanding of data quality frameworks, governance, and security best practices.
- Ability to work effectively in an Agile/Scrum environment with cross-functional teams.

JAVA TEAM LEAD @ COGENT CUBE

Salary: Negotiable or Not Mentioned · Remote, USA · Posted 23 days ago · cogentcube.com · 1180 views

Cogent Cube is seeking a highly skilled Java Team Lead to drive the technical direction and execution of our backend systems. This role is pivotal in managing distributed environments using Java and Spring Boot. You will lead by example, mentoring junior developers and ensuring that the team adheres to best practices in microservices architecture, service orchestration, and RESTful API development. Your leadership will be crucial in maintaining a collaborative environment through pair programming and shared engineering standards.

The successful candidate will demonstrate extensive experience with Apache Kafka and distributed systems concepts. You will be expected to design and implement sophisticated automated testing frameworks and drive the automation of manual workflows. Proficiency in cloud-native deployments across AWS, Azure, or Google Cloud Platform is required, alongside a strong grasp of DevOps practices involving Docker and Kubernetes. Additionally, you will utilize observability tools such as Splunk, Dynatrace, Prometheus, and Grafana to monitor performance and reliability across the stack. This position offers flexible work arrangements including onsite, hybrid, or remote options.

Key Requirements

- 15+ years of professional experience in backend development using Java and Spring Boot.
- Proven expertise in architecting microservices and RESTful API service orchestration.
- Deep hands-on experience with Apache Kafka and distributed systems concepts.
- Demonstrated history of engineering team leadership, mentoring, and technical guidance.
- Experience with collaborative development practices including pair programming.
- Extensive experience designing and implementing automated testing frameworks (Unit, Integration, Contract, E2E).
- Strong track record of automating manual workflows and internal processes.
- Familiarity with AI-assisted development tools to improve productivity.
- Proficiency in DevOps practices, including Docker, Kubernetes, and cloud-native deployments.
- Hands-on experience with major cloud providers such as AWS, Azure, or Google Cloud Platform.
- Experience with observability and monitoring tools like Prometheus, Grafana, Dynatrace, or Splunk.
- Knowledge of batch and data processing tools such as Apache Spark or Spring Batch.

CAPA COORDINATOR / FACILITATOR @ STARK PHARMA

Salary: ~8,000 (mentioned) · USA · Posted 22 days ago · starkpharma.com · 1436 views

Join a leading medical device manufacturer as a CAPA Coordinator / Facilitator for a 12-month engagement. In this high-impact position, you will drive CAPA activities tied specifically to FDA audit findings, focusing on quality excellence and cross-functional collaboration. You will be responsible for leading activities from initiation to closure, including driving root cause analysis using methodologies like 5 Whys and Fishbone diagrams. Possible work locations for this role include Minnesota and California. The pay rate for this position is $50/hr (W2).

As a facilitator, you will lead discussions, ensure team accountability, and coach others on CAPA best practices and compliance. You will partner closely with Quality, Engineering, and Operations teams to improve processes, tools, and reporting. Candidates should have over 5 years of experience in the Medical Device industry and a proactive, solution-driven mindset. Experience with process monitoring and production-related CAPAs is highly desirable for success in this quality-driven engineering environment.

Key Requirements

- Minimum of 5 years experience in the Medical Device industry.
- Proven track record in CAPA coordination and coaching teams.
- Expertise in Root Cause Analysis methodologies such as 5 Whys and Fishbone.
- Strong background in Quality Engineering or a related technical field.
- Ability to lead CAPA activities from initial discovery to final closure.
- Proficiency in facilitating cross-functional discussions and driving accountability.
- Experience working with FDA audit findings and compliance standards.
- Ability to partner effectively with Quality, Engineering, and Operations teams.
- Skills in improving CAPA processes, tools, and documentation reporting.
- Experience with process monitoring and production-related CAPAs is preferred.