...

Coexpression has O(n^3) ("cubic") time complexity and O(n^2) ("quadratic") space complexity, where n is the number of probes in the dataset.  The time complexity is due primarily to the TOM computation.  The space complexity is due to the need to hold the n x n correlation and TOM matrices in memory.

The R language has an inherent limit on the size of a vector or matrix of about 2 billion (2^31 - 1) elements (http://stat.ethz.ch/R-manual/R-devel/library/base/html/Memory-limits.html).  This means the maximum size of a square matrix is 46340 x 46340.  A square matrix in R requires 8n^2 bytes, because values are double precision and R has no provision for single-precision representation of floating-point values.  A maximum-size matrix therefore requires about 17.2 GB (16 GiB) of contiguous memory.

The coexpression package uses two n x n matrices (as mentioned above) and at times creates temporary variables whose sizes equal, or are a large fraction of, the n x n matrix size.  Therefore, to support maximum dataset sizes, machines used to run coexpression should have dozens of GB of memory.  In the empirical evaluations summarized below, we use Amazon Web Services' "High-Memory Quadruple Extra Large" machines, each of which has 68 GB RAM and costs (at the time of writing) $2.88/hour (http://aws.amazon.com/ec2/instance-types/).
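The arithmetic behind these limits is easy to reproduce at an R prompt.  The following sketch (an illustrative session, not part of the coexpression package) derives the maximum square-matrix dimension from R's per-object element limit and estimates the memory footprint of an n x n double-precision matrix:

    # R limits any vector or matrix to 2^31 - 1 elements (.Machine$integer.max)
    max.elements <- .Machine$integer.max           # 2147483647
    max.dim <- floor(sqrt(max.elements))           # 46340

    # Memory (in GiB) for an n x n matrix of 8-byte doubles
    matrix.gib <- function(n) 8 * n^2 / 2^30

    matrix.gib(max.dim)       # ~16 GiB for one maximum-size matrix
    2 * matrix.gib(max.dim)   # ~32 GiB to hold the correlation and TOM matrices together

The doubled figure in the last line is why the 68 GB machines described above are comfortable for maximum-size datasets: two full-size matrices plus comparably sized temporaries still fit in RAM.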

...