
...

First Example: Shotgun Stochastic Search

See also Getting Started with RHadoop on Elastic Map Reduce

All hosts in the Hadoop cluster must receive:

...

Code Block

library(rmr2)   # assumes RHadoop's rmr2 package; it provides to.dfs(), mapreduce(), from.dfs(), and keyval()

# generate the weights file (weightsMatrix below, one weight vector per iteration)

mapperFunc <- function(key, value) { # this will run on all the machines in the cluster
   weightsIteration <- key
   weights <- value

   # write the weight vector to a file on local disk

   # exec the stochastic search binary

   # read the output files from the stochastic search (perhaps upload those files to S3)

   # compute p-values, area under the ROC, correlation coefficients...
   # (pval, rocArea, and coeffs below come from that post-processing)

   keyval(weightsIteration, list(pval = pval, rocArea = rocArea, coeffs = coeffs))
}

mapperInput <- to.dfs(weightsMatrix)
mapperOutput <- mapreduce(input = mapperInput, map = mapperFunc)

stochasticSearchResults <- from.dfs(mapperOutput)

# iterate over stochasticSearchResults, or write a real reducer function too (see the sketch below)
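
The results can be consumed on the driver after from.dfs(), or the aggregation can be folded into the job by passing a reduce function to mapreduce(). A minimal sketch, assuming rmr2-style keys()/values() accessors, that each element of values() is the statistics list emitted by mapperFunc, and a hypothetical reducerFunc that just averages the p-values:

Code Block

# driver-side post-processing; the exact shape of from.dfs() output depends
# on the rmr version, so treat this as an illustration
iterations <- keys(stochasticSearchResults)     # one entry per weights iteration
stats      <- values(stochasticSearchResults)   # assumed: the statistics list emitted per iteration

pvals <- sapply(stats, function(s) s$pval)      # e.g. collect the p-value for each iteration

# or fold the aggregation into the job with a real reducer
# (hypothetical reducerFunc; vals collects every value emitted under one key)
reducerFunc <- function(key, vals) {
   keyval(key, mean(sapply(vals, function(v) v$pval)))
}
mapperOutput <- mapreduce(input = mapperInput, map = mapperFunc, reduce = reducerFunc)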

...