Hadoop Developer (BH-353068-1)

Australian Capital Territory, Australia

We have a requirement for a Hadoop Developer for a federal government client.

Duration: Until June 2020, with two possible 12-month extensions

Citizenship requirements: Australian Citizen with Baseline Clearance

Role Description:
  • Designing, building, installing, configuring and supporting Hadoop;
  • Implementing data workflows and ETL processes;
  • Participating in design reviews, code reviews, unit testing and integration testing;
  • Identifying opportunities to improve the performance and quality of the platform;
  • Pre-processing using Hive, Pig, Spark, etc;
  • Translating complex functional and technical requirements into detailed design;
  • Performing analysis of vast data stores and uncovering insights;
  • Implementing security and data privacy according to design;
  • Creating scalable and high-performance web services for data tracking;
  • Developing, managing and deploying additional components such as NiFi, Kafka, Zeppelin and Solr.

Essential Criteria:
  • Bachelor's degree in Computer Science or a related field;
  • A minimum of 5 years' working experience in data warehousing and Business Intelligence systems;
  • Demonstrated proficiency in designing and delivering scalable solutions in several of the following:
  1. Hortonworks / Cloudera Data Platform including core technologies such as HDFS & YARN, Hive, HBase, Pig, Sqoop and Phoenix;
  2. Hadoop Security Technologies such as Knox, Ranger and Atlas, as well as related technologies such as Kerberos;
  3. Hortonworks / Cloudera Data Flow Platforms including technologies such as NiFi, Kafka, SAM / Storm;
  4. Additional technologies such as JanusGraph and Solr.
  • Experience integrating data from multiple sources.

Desirable Criteria:
  • Experience working within government agencies;
  • Experience developing primarily in one or more of JavaScript, XSLT, HTML and Cascading Style Sheets;
  • Understanding of:
    • Oxygen or other XML/XSLT editing and debugging tools;
    • Accessibility and usability standards, including WCAG, WAI-ARIA and semantic HTML;
    • Rational ClearCase;
    • Maven;
    • SQL preferably with Oracle or DB2;
  • Experience working in a SOA environment;
  • Previously worked in a big data environment or project.
If you are interested in this role, please APPLY NOW. For more information, please contact Karun on 02 6113 7539 or via email at karun.sharma@igniteco.com. Ref# 353068