3+ years of professional IT experience in the Big Data ecosystem and related technologies
Excellent experience with Hadoop architecture and hands-on experience installing, configuring, and using Hadoop ecosystem components such as MapReduce, HDFS, HBase, Hive, Sqoop, Pig, ZooKeeper, Flume, Kafka, and Spark
Good knowledge of Hadoop cluster architecture and experience monitoring clusters
In-depth understanding of data structures and algorithms
Experience in managing and reviewing Hadoop log files
Excellent understanding and knowledge of NoSQL databases such as MongoDB and HBase
Experience setting up standards and processes for Hadoop-based application design and implementation
Experience importing and exporting data between HDFS and relational database systems using Sqoop
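A typical Sqoop import/export pair for the work described above might look as follows — a sketch only, since the JDBC URL, credentials, table names, and HDFS paths here are hypothetical placeholders, not values from any actual engagement:

```shell
# Import an RDBMS table into HDFS (4 parallel map tasks)
sqoop import \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user -P \
  --table orders \
  --target-dir /user/hive/warehouse/orders \
  --num-mappers 4

# Export processed results from HDFS back to the RDBMS
sqoop export \
  --connect jdbc:mysql://db.example.com/sales \
  --username etl_user -P \
  --table order_summary \
  --export-dir /user/hive/warehouse/order_summary
```

Both commands require a configured Hadoop cluster and a reachable database, so they are shown here only to illustrate the pattern.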
Experience integrating live streaming data using Kafka and Spark
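The Kafka-to-Spark integration mentioned above commonly takes the shape of a Spark Structured Streaming job. The sketch below assumes a Spark environment with the Kafka connector package available; the broker address and topic name ("events") are hypothetical:

```python
# Minimal Spark Structured Streaming sketch that consumes a Kafka topic.
# Assumes pyspark plus the spark-sql-kafka connector on the classpath;
# broker and topic names are placeholders, not real infrastructure.
from pyspark.sql import SparkSession

spark = (SparkSession.builder
         .appName("kafka-stream-sketch")
         .getOrCreate())

# Subscribe to a Kafka topic; key/value arrive as binary columns
stream = (spark.readStream
          .format("kafka")
          .option("kafka.bootstrap.servers", "broker1:9092")
          .option("subscribe", "events")
          .load())

# Decode the message payload and write each micro-batch to the console sink
query = (stream.selectExpr("CAST(value AS STRING) AS payload")
         .writeStream
         .format("console")
         .start())

query.awaitTermination()
```

The console sink is used here only for illustration; a production job would typically write to HDFS, HBase, or another durable store with checkpointing enabled.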
Experience in managing Hadoop clusters
Ability to adapt to evolving technologies, with a strong sense of responsibility and accomplishment