Technical Architect-Hadoop

- Good knowledge of database structures, theories, principles, and practices
- Experience in Java
- Knowledge of Hadoop (HDFS and MapReduce) concepts
- Ability to write MapReduce jobs
- Proven understanding of Hadoop, HBase, Hive, and Pig
- Ability to write Pig Latin scripts
- Hands-on experience with HiveQL
- Familiarity with data-loading tools such as Flume and Sqoop
- Knowledge of workflow schedulers such as Oozie
- Good aptitude for multi-threading and concurrency concepts
- Experience loading data from disparate data sources
- Certifications such as Cloudera Developer/Administrator an added advantage
- Hands-on experience with at least two NoSQL databases
- Ability to analyze and identify issues with an existing cluster and suggest architectural design changes
- Knowledge of, and ability to implement, data governance in Hadoop clusters
- Experience with Hortonworks Data Platform, with a Java background
- Strong understanding of underlying Hadoop concepts and distributed computing
- Strong MapReduce programming skills
- Expertise with Hive
- Experience working with Big Data on Amazon Web Services (AWS)
- Experience with Redshift, Elastic MapReduce, and S3 on AWS
- Customer-facing skills; responsibility for deliverables, schedule, and effort management
- Ability to lead and manage a team
- Performing requirements analysis and choosing the platform
- Designing the technical architecture and application design
- Deploying the proposed Hadoop solution
- Pivotal roles played as an engineer and architect across domains
- Experience with machine learning algorithms and data mining techniques
- Analytical and problem-solving skills applied to the Big Data domain
- Comfort with Agile methodologies, in order to reach difficult engineering decisions quickly
- Ability to clearly articulate the pros and cons of various technologies and platforms
- Ability to document use cases, solutions, and recommendations
- Knowledge of web analytics and exposure to data sources such as clickstream data a plus
- Pre-sales experience an added advantage
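As a rough illustration of the map/reduce word-count pattern the role's "ability to write MapReduce jobs" refers to, the sketch below expresses the map step (emit a token per word) and the reduce step (sum counts per key) in plain Java. It is a self-contained sketch only: a real Hadoop job would instead extend the `Mapper` and `Reducer` classes from the `org.apache.hadoop.mapreduce` API and run on a cluster; the class and method names here are invented for the example.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class WordCountSketch {

    // Map phase: split each input line into lowercase words (the emitted keys).
    // Reduce phase: group identical words and sum their occurrences.
    public static Map<String, Long> wordCount(List<String> lines) {
        return lines.stream()
                .flatMap(line -> Arrays.stream(line.toLowerCase().split("\\s+")))
                .filter(word -> !word.isEmpty())
                .collect(Collectors.groupingBy(word -> word, Collectors.counting()));
    }

    public static void main(String[] args) {
        List<String> input = List.of("hadoop map reduce", "map reduce map");
        // Each word mapped to its total count across all input lines.
        System.out.println(wordCount(input));
    }
}
```

The same split (a stateless per-record map, a shuffle by key, a per-key aggregation) is what HDFS and the MapReduce framework distribute across nodes; the cluster handles partitioning and fault tolerance rather than the job author.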