One of the largest genetic testing companies in the US runs a streamlined operation producing large volumes of genomic data for pharma and direct-to-consumer (DTC) customers with diverse needs. Their goal is to be highly efficient, both operationally and technically. To achieve this, they needed to massively scale production volumes while maintaining data quality and workflow integrity. Additionally, they wanted to expand their technology to support a wider range of data types, such as whole genome sequencing, so that output could be customized to different industry needs.
- Current workflows do not support large-scale production volumes (>10,000 samples per day)
- Increasing production internally and building a system to support this objective is a computationally demanding and complex undertaking
- There are several potential bottlenecks in the processing, handling, and storage of generated data
- Limited resources to develop technology that enriches genotype datasets and provides more data for downstream analysis
BC Platforms worked with this client to create an automated, streamlined, and efficient process that handles extremely large volumes of heterogeneous array types while simultaneously increasing the efficiency of genomic data storage. This approach applies cutting-edge technology to form a novel, scalable, and automated system that helps eliminate bottlenecks along extensive data processing workflows.
- Easy access to large amounts of data in a tiled dataset format
- Customized storage for convenient analysis in the cloud or on-premises
- Extremely fast data processing and retrieval without decompression
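To make the tiled-dataset idea in the first benefit concrete, here is a minimal sketch of how a samples-by-variants genotype matrix can be stored as fixed-size tiles so that a rectangular query touches only the tiles it overlaps. This is an illustrative assumption, not BC Platforms' actual storage engine: the `GenotypeStore` class, `TILE` size, and dict-based tile map are all hypothetical.

```python
# Illustrative tiled genotype storage (hypothetical names and layout;
# not BC Platforms' actual API).
TILE = 4  # tile edge length: TILE samples x TILE variants per tile


class GenotypeStore:
    """Store a samples-by-variants genotype matrix as fixed-size tiles."""

    def __init__(self):
        # (tile_row, tile_col) -> 2D list of genotype calls
        self.tiles = {}

    def write(self, sample, variant, call):
        # Locate the tile containing this (sample, variant) cell,
        # creating it lazily so sparse data costs little storage.
        key = (sample // TILE, variant // TILE)
        tile = self.tiles.setdefault(
            key, [[None] * TILE for _ in range(TILE)])
        tile[sample % TILE][variant % TILE] = call

    def read_region(self, s0, s1, v0, v1):
        """Read a rectangular region, touching only overlapping tiles."""
        out = []
        for s in range(s0, s1):
            row = []
            for v in range(v0, v1):
                tile = self.tiles.get((s // TILE, v // TILE))
                row.append(tile[s % TILE][v % TILE] if tile else None)
            out.append(row)
        return out


store = GenotypeStore()
store.write(5, 9, 2)  # sample 5, variant 9, genotype call 2 (hom-alt)
region = store.read_region(4, 6, 8, 10)  # -> [[None, None], [None, 2]]
```

Because each tile is an independent unit, tiles can also be compressed individually, which is what allows region queries to decompress only the tiles they need rather than the whole dataset.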