Aug 11, 2018

Big Data Developer - Hadoop

  • Bank of America
  • Newark, DE, USA
Full-Time Engineering

Job Description

Participates in the design, development, and implementation of architectural deliverables, including components of the assessment and optimization of system design and the review of user requirements. Contributes to determining the technical and operational feasibility of solutions. Develops prototypes of the system design and works with database, operations, technical support, and other IT areas as appropriate throughout the development and implementation processes. May lead multiple projects with competing deadlines. Serves as a fully seasoned, proficient technical resource, providing technical knowledge and capabilities as a team member and individual contributor. Will not have direct reports but will influence and direct the activities of a team related to special initiatives or operations, as well as mentor junior Band 5 Architect 1s. Provides input on staffing, budget, and personnel. Typically has 7 or more years of architecture experience.

Position Summary
Mandatory Technical Skills:
  • Extensive knowledge of the Hadoop stack and storage technologies: HDFS, MapReduce, YARN, Hive, Sqoop, Impala, Spark, Flume, Kafka, and Oozie
  • Extensive knowledge of Big Data enterprise architecture (Cloudera preferred)
  • Experience in NoSQL technologies (Cassandra, HBase)

Required Qualifications & Experience:
  • Bachelor's degree in Science or Engineering
  • 7+ years of industry experience
  • 3+ years of Big Data experience
  • Develop Big Data Strategy and Roadmap for the Enterprise
  • Experience in Capacity Planning, Cluster Designing and Deployment
  • Benchmark systems, analyze system bottlenecks, and propose solutions to eliminate them
  • Develop highly scalable and extensible Big Data platform, which enables collection, storage, modeling, and analysis of massive data sets from numerous channels
  • Continuously evaluate new technologies, innovate and deliver solution for business critical applications

Good to Have:
  • Experience in real-time streaming (Kafka)
  • Experience with Big Data analytics and business intelligence using industry-standard tools integrated with the Hadoop ecosystem (R, Python)
  • Knowledge of visual analytics tools (Tableau)
  • Data integration and data security on the Hadoop ecosystem (Kerberos)
  • Awareness of or experience with Data Lake on the Cloudera ecosystem



Security Clearance

No security clearance required
