Details
- Type: extRequest
- Status: Closed
- Priority: Major
- Resolution: Done
- Component/s: FICHe
- Labels: None
Description
Dear All,
Medbravo continuously processes big data on clinical cancer research.
To accommodate Medbravo's growing data processing needs within the resources
provided by the COSMOS Global Instance, we would kindly ask you for a larger
quota in HDFS.
Specifically, *our current quota of 5 GB* limits both the input and output
data size of Medbravo's Hadoop MapReduce jobs. Our current input for big data
processing is 8.5 GB, and it will keep growing. Our output is larger still, as
it consists of the input data plus additional information.
We are therefore asking you to increase our HDFS quota. We need *at least
30 GB* to safely store the input, output and any intermediate data produced
during processing.
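For reference, a minimal sketch of the HDFS commands involved, assuming the Medbravo data lives under /user/medbravo (the path is an assumption, and setting the quota requires administrator privileges on the Cosmos cluster). Note that HDFS space quotas are accounted against replicated bytes, so the value may need to be scaled by the cluster's replication factor:

    # Check the current space quota and usage for the directory
    hdfs dfs -count -q /user/medbravo

    # Raise the space quota to 30 GB (run by the HDFS administrator)
    hdfs dfsadmin -setSpaceQuota 30g /user/medbravo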
We would really appreciate your favourable response.
We wish you a Merry Christmas and a Prosperous New Year,
Aurelia
–
Aurelia Bustos MD
Cofounder
tel: (+34) 618 453 214
www.medbravo.org
[Created via e-mail received from: Aurelia Bustos <aurelia@medbravo.org>]
Issue Links
- relates to: HELP-5605 FIWARE.Request.Tech.Data.BigData-Analysis.Medbravo - Cosmos GE HDFS Quota (Closed)
Activity