The candidate will work with various resources to identify opportunities to leverage big data technologies, equipping business units and their systems with a common set of tools and infrastructure that makes analytics efficient and insightful.
Able to develop standard J2EE applications as well as applications in a Hadoop environment and in Scala, and to build machine learning applications
Experience in the architecture and implementation of large, highly complex projects using HDFS
Experienced in developing models with contextual data and proficient in machine learning algorithms
Experience with Spark, Python, Scala, HBase, Kafka, Hive, Cloudera, Hortonworks, Pig, Oozie, Sqoop, and Flume
Experience with multithreading and distributed computing
3 to 4 years of experience in advanced statistical techniques including predictive statistical models, segmentation analysis, customer profiling, survey design and analysis, and data mining
Experience working with large datasets in a Big Data environment
Hands-on experience with big data platforms and tools, including data ingestion, transformation, and delivery in the Hadoop ecosystem
Deep understanding of cloud computing infrastructure and platforms.
Architect highly scalable distributed systems using a variety of open-source tools
Evaluate new technologies and conduct product research to identify opportunities that can accelerate access to data and automate key data flows
Thanks & Regards
Global Resource Management Inc.