Cloudera optimization for HDFS/Spark environment -- 2
$30-250 USD
Closed
Posted almost 8 years ago
Paid on delivery
Need to optimize performance by configuring server resource utilization (memory and CPU) on the nodes of a recently installed environment running Cloudera 5.5.
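For context, memory and CPU tuning on a CDH 5.5 cluster largely comes down to telling YARN how much of each node's resources the NodeManager may hand out to containers. Below is a minimal sketch of the relevant properties, assuming hypothetical worker nodes with 64 GB of RAM and 16 cores (the exact values must be sized to the real hardware, leaving headroom for the OS and Hadoop daemons); on a Cloudera 5.5 install these settings would normally be applied through Cloudera Manager rather than by editing yarn-site.xml by hand:

```xml
<!-- yarn-site.xml fragment (illustrative values, not a recommendation) -->
<configuration>
  <!-- Memory YARN may allocate to containers on this node:
       48 GB of an assumed 64 GB node, reserving ~16 GB for OS/daemons -->
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>49152</value>
  </property>
  <!-- Virtual cores available to containers: 12 of an assumed 16 cores -->
  <property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>12</value>
  </property>
  <!-- Largest single container the scheduler will grant -->
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>16384</value>
  </property>
</configuration>
```

Spark executors then have to fit inside these limits, e.g. `spark-submit --executor-memory 4g --executor-cores 2 ...`, keeping in mind that YARN also reserves per-executor overhead on top of `--executor-memory` (controlled by `spark.yarn.executor.memoryOverhead` in the Spark 1.x line that ships with CDH 5.5).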
We are a group of Data Scientists based in Bangalore. Our core areas of expertise are big data and machine learning.
We can assist you with Hadoop configuration and cluster optimisation, and later with implementing complex projects.
Hello
I have experience with HDFS.
I've built a search system using Solr, HDFS (Hadoop), and Nutch:
Solr for indexing and searching,
Hadoop for HDFS and MapReduce,
Nutch for crawling.
I am a Cloudera- and MapR-certified Hadoop administrator with good knowledge of Linux. Please let me know how to proceed further on this.
I have experience in Hadoop cluster security, optimization, and sizing. I can do this task for you.