24 Jun 2019

Senior Hadoop Developer – US

Job Description

As a Senior Hadoop Developer at Bridgetree, you will help set up Big Data environments for Bridgetree clients. You will be responsible for our readiness, plan and execute proofs of concept, and lead implementation plans to success.

To be successful in this role, you will have hands-on experience with Big Data platforms associated with enterprise-scale applications and systems. You live your passion for leveraging emerging technologies and next-generation architectures to solve business problems. You will need the ability to teach, mentor, and take charge of a hands-on delivery team, as well as support the sales team on potential business opportunities.


  • Experience architecting Big Data platforms using the Azure Hadoop (Hortonworks) cloud platform.
  • Strong knowledge of cloud data processing architectures and an understanding of distributed systems architecture leveraging Big Data (HDFS).
  • In-depth knowledge of popular database and data warehouse technologies from Microsoft, Amazon, and/or Google (Big Data and conventional RDBMS), including Microsoft Azure SQL Data Warehouse.
  • Fluency in at least two object-oriented languages, preferably Java and Python, familiarity with functional languages, and proficiency in SQL.
  • Experience with Hive or Impala.
  • Experience with major distributed processing in clusters using frameworks such as YARN and Spark.
  • Experience with Kafka and Spark Streaming is a plus but not mandatory.
  • Knowledge of Teradata, Redshift, BigQuery, and Snowflake is a plus.


  • Ability to work with software engineering teams and understand complex development systems and patterns.
  • Designing, building, installing, configuring, and supporting Hadoop on Azure.
  • Ability to load data from disparate data sources.
  • Pre-processing data using Hive or Pig.
  • Performing analysis of Big Data stores and producing insights.
  • Ability to write complex, high-speed queries.
  • Translating robust functional/technical requirements into detailed designs and models.
  • Being part of POC teams that build new Hadoop clusters for different clients.
  • Testing POCs and helping to operationalize projects.
  • Following and documenting Hadoop best practices and standards.
  • Maintaining the security and privacy of Big Data.
  • Ability to create SQL data marts from Hadoop.

Years of Experience: Overall 8-10 years with at least 3 years in Hadoop

Duration: 6 Months (Possible Extension)

Location: USA (Remote)

Bridgetree, headquartered in Fort Mill, SC, is a 23-year-old company with an established customer base. Our services include data management, analytics, and web-based and logistics applications for consumer and business marketers across a variety of industries. We provide quantitative data management support, as well as measurement, tracking, and reporting that influence and enhance marketing strategies.

All applicants are subject to background checks.

Apply to: JobsUS@bridgetree.com
