AWS Public Sector Blog
Tag: research
UC Davis CWEE accelerates water conservation research with secure, compliant data storage on AWS
To solve some of the most pressing water and energy challenges, scientists and engineers need access to robust, reliable data that is often sensitive and protected. Data providers, researchers, and host institutions need to adhere to strict requirements for protecting and securing this data. The Center for Water-Energy Efficiency (CWEE) at the University of California, Davis (UC Davis) used AWS to create a centralized, secure data repository that streamlines data sharing.
The Institute of Human Virology Nigeria reduced costs by 64% by migrating hundreds of mailboxes to Amazon WorkMail
The Institute of Human Virology Nigeria (IHVN) uses email as its main communication method, with over 800 email accounts using up to 50 GB of storage space each. IHVN used AWS to migrate its email to Amazon WorkMail, reducing the cost of each mailbox by 64 percent and allowing its IT team to easily manage the corporate email infrastructure with enterprise-grade security.
Data egress waiver available for eligible researchers and institutions
The Global Data Egress Waiver (GDEW) program helps eligible researchers and academic institutions use AWS cloud storage, computing, and database services by waiving data egress fees. GDEW can be a valuable tool that gives eligible researchers and institutions a more predictable budget, which in turn allows them more direct access to the cloud than they might otherwise have. Find out if your team is eligible to take advantage of the data egress waiver program.
Driving innovation in single-cell analysis on AWS
Computational biology is undergoing a revolution. However, the analysis of single cells is a hard problem to solve. Standard statistical techniques used in genomic analysis fail to capture the complexity present in single-cell datasets. Open Problems in Single-Cell Analysis is a community-driven effort using AWS to drive the development of novel methods that leverage the power of single-cell data.
Paris-Saclay University uses AWS to advance data science through collaborative challenges
This is a guest post by Maria Teleńczuk, research engineer at the Paris-Saclay Center for Data Science (CDS), and Alexandre Gramfort, senior research scientist at INRIA, the French National Institute for Research in Digital Science and Technology. Maria and Alexandre explain how they adapted their open source data challenge platform RAMP to train the models submitted by student challenge participants using Amazon Elastic Compute Cloud (Amazon EC2) Spot Instances, and how they used AWS to support three student challenges.
Assessing the ocean’s health by monitoring shark populations
OCEARCH is a data-centric organization built to help scientists collect previously unattainable data about the ocean. Its mission is to accelerate the ocean's return to balance and abundance through innovation in scientific research, education, outreach, and policy, using unique collaborations of individuals and organizations in the US and abroad. As part of the Amazon Sustainability Data Initiative (ASDI), we invited Fernanda Ubatuba, president and COO at OCEARCH, to share how her organization is making strides in ocean conservation and how AWS is supporting her mission.
Using big data to help governments make better policy decisions
In Europe, government agencies and policy makers see the value in using new technology to unlock digital transformation and deliver better, more innovative citizen services. Data-for-statistics initiatives, including open data, can help researchers produce innovative products and tools, such as visualizations, that inform government officials before they make policy decisions that affect their citizens. When it comes to big data, policy makers need to collaborate with researchers to address the issues and challenges of using these new data sources. To work toward this goal, Eurostat, the statistical office of the EU, hosted its biennial European Big Data Hackathon.
Accelerating genome assembly with AWS Graviton2
One of the biggest scientific achievements of the twenty-first century was the completion of the Human Genome Project and the publication of a draft human genome. The project took over 13 years to complete and remains one of the largest public-private international collaborations ever. Advances since then in sequencing technologies, computational hardware, and novel algorithms have reduced the time it takes to produce a human genome assembly to only a few days, at a fraction of the cost. This has made using the human genome draft for precision and personalized medicine more achievable. In this blog post, we demonstrate how to perform a genome assembly in the cloud in a cost-efficient manner using Arm-based AWS Graviton2 instances.
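As a rough illustration of the setup the post describes (not the post's own walkthrough), a single Graviton2 instance sized for a memory-hungry assembly job can be declared as infrastructure-as-code; in the CloudFormation sketch below, the AMI ID, key pair, and instance size are placeholder assumptions:

```yaml
# Hypothetical CloudFormation fragment: one Arm-based Graviton2
# instance for an assembly run. Replace the placeholder AMI and
# key pair with real values; pick an instance size to match the
# assembler's memory needs.
Resources:
  AssemblyNode:
    Type: AWS::EC2::Instance
    Properties:
      InstanceType: r6g.16xlarge   # Graviton2: 64 vCPUs, 512 GiB memory
      ImageId: ami-EXAMPLE         # an arm64 (aarch64) Amazon Linux 2 AMI
      KeyName: my-key-pair         # placeholder SSH key pair
```

Because Graviton2 processors are aarch64, the assembler and its dependencies must be built for (or available as) arm64 binaries; that, plus the lower per-vCPU price of the `*6g` families, is where the cost efficiency comes from.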
Modeling clouds in the cloud for air pollution planning: 3 tips from LADCO on using HPC
In the spring of 2019, environmental modelers at the Lake Michigan Air Directors Consortium (LADCO) had a new problem to solve. Emerging research on air pollution along the shores of the Great Lakes in the United States showed that properly simulating pollution episodes in the region required applying their models at a finer spatial granularity than the computational capacity of their in-house HPC cluster could handle. The LADCO modelers turned to AWS ParallelCluster to access the HPC resources needed to run this modeling faster and scale for their member states.
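The ParallelCluster approach mentioned above comes down to a declarative cluster definition that provisions a scheduler head node and scales compute nodes up for a modeling run and back down when idle. The following is a minimal sketch in AWS ParallelCluster 3.x syntax, not LADCO's actual configuration; the region, subnet IDs, key name, and instance counts are placeholder assumptions:

```yaml
# Hypothetical AWS ParallelCluster 3.x config: a Slurm-managed
# cluster whose compute queue scales from zero up to 16 nodes.
Region: us-east-2
Image:
  Os: alinux2
HeadNode:
  InstanceType: c5.xlarge
  Networking:
    SubnetId: subnet-EXAMPLE
  Ssh:
    KeyName: my-key-pair
Scheduling:
  Scheduler: slurm
  SlurmQueues:
    - Name: modeling
      ComputeResources:
        - Name: compute
          InstanceType: c5n.18xlarge   # network-optimized, suited to tightly coupled HPC jobs
          MinCount: 0                  # scale to zero between modeling runs
          MaxCount: 16
      Networking:
        SubnetIds:
          - subnet-EXAMPLE
```

Scaling compute to zero between episodes is what makes this pattern attractive for consortium budgets: capacity is paid for only while a simulation is running.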
Inside a self-service cloud research computing platform: How RONIN is built on AWS
RONIN is an AWS Partner solution that empowers researchers with a simple interface to create and control computing resources, set and monitor budgets, and forecast spend. RONIN is designed and architected to advance research institutions' missions by providing a research platform that manages the most common research use cases and is also compatible with advanced cloud computing services from AWS. Learn what powers RONIN underneath the user-friendly interface.