How the National Science Foundation cut its data center in half
Editor's Note: This is part of a series of stories looking into the federal IT space. Stay tuned for a peek into the Nuclear Regulatory Commission's IT and the need for FITARA's regulations.
Outdated technical infrastructure is the ball and chain of modernization. Before the National Science Foundation's (NSF) digital migration, its expert panelists were tethered to computers built into desks, leaving remote work off the table.
Since then, the NSF's modernization has afforded panelists the option of working from home or bringing their own devices.
Digital migration reshapes nearly every aspect of IT, and it is up to federal CIOs to oversee its execution while adhering to regulations. Cloud-based applications are enabling agencies to downsize data centers and streamline their services.
Cloud-adoption reluctance is most severe in agencies with sensitive data missions or transaction processing systems, such as the Social Security Administration and the Internal Revenue Service (IRS). Fear of security risks stemming from disruptions during migration can make already taxed CIOs hesitant to adopt the cloud.
Still, federal agencies are taking an aggressive approach to modernization. For example, under the guidance of acting CIO Dorothy Aronson, the NSF moved and downsized its entire technical infrastructure.
Aronson has worked in the federal government for more than 20 years and oversees the NSF's 300-person IT staff and its $118 million IT budget. She was with the NSF for about 10 years before inheriting the role following the death of former NSF CIO Amy Northcutt. As part of the NSF's digital transformation, Aronson oversaw the successful move from the agency's Arlington facility to its new home in Alexandria.
"Because our new data center is so much smaller, we had been creating extra space for some time by virtualizing servers and getting rid of old ones," Aronson said in an interview with CIO Dive. "We were able to move into a much smaller, say, computer room, rather than a big data center."
Birds aren't the only thing migrating these days
The NSF receives about 40,000 scientific research proposals each year and invites outside expert panelists to review them. Prior to the NSF's gradual virtualization, panelists had to work onsite at the NSF at computers built into desks.
Since then, the NSF has allowed panelists to either borrow devices or bring their own, and the agency created several different networks for added security. Inside the new Alexandria facility, the NSF set up an internal network and a visitor network. Both are wireless, but they required a configuration that allows a network to recognize the user rather than the device itself.
This is unique to the NSF, according to Aronson, and the network configuration is similar to a hotel's. "If you work here, it doesn't ask you to identify yourself," said Aronson. "And it puts you right onto the internal network, but if you're a panelist it will ask you to identify yourself and enroll."
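The setup Aronson describes, one network that recognizes staff automatically and another that asks visitors to enroll, maps to a common enterprise pattern: an internal SSID using 802.1X (WPA2-Enterprise) authentication against a central directory, alongside a guest SSID gated by a captive portal. A hypothetical hostapd-style sketch of the internal side is below; all names and addresses are illustrative, not the NSF's actual configuration:

```
# Internal SSID: 802.1X / WPA2-Enterprise. Staff devices authenticate
# automatically with stored credentials or certificates, so employees
# never see a sign-in page (hypothetical values throughout).
interface=wlan0
ssid=AGENCY-INTERNAL
wpa=2
wpa_key_mgmt=WPA-EAP
ieee8021x=1
auth_server_addr=10.0.0.5            # RADIUS server (illustrative address)
auth_server_port=1812
auth_server_shared_secret=changeme   # placeholder secret

# The visitor SSID would instead be open or portal-based, redirecting
# panelists to an enrollment page before granting network access.
```

In this pattern the identity travels with the user's credentials rather than the device, which is what lets a borrowed laptop or a personal one land on the right network automatically.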
Gradual cloud adoption prior to the move allowed the NSF to avoid any interruption in customer service. The only time NSF networks were offline was over the July 4 weekend, when the remaining hardware was physically forklifted to the new location. After the move, the older equipment remaining in Arlington was decommissioned.
Aronson described the move, while seemingly daunting, as "really exciting." When "we moved stuff from a large space to a small space without customers being impacted, we were thrilled," she said.
Cloud-based functions also enabled the NSF to streamline accommodations for its selected panelists by "providing conferencing capabilities, video conferencing, so that our panels can be completely virtual," said Aronson. The IT support generated by modernization allowed the NSF to accommodate more remote panel experts virtually, saving money.
Department heads in the clouds
The federal government is trying to modernize its technical infrastructure. In September, the Senate passed the Modernizing Government Technology (MGT) Act as an amendment to the National Defense Authorization Act, acknowledging the act as a component of national security.
But before such legislation, the initial cloud-first policy announcement took about five to six years to take hold, and even since then, discomfort with the cloud has remained, according to Rick Holgate, a federal IT analyst for Gartner.
Much resistance comes from the business side of IT, forcing CIOs to defend the need for digital migration. "CIOs are leading the way for the organization, helping them get used to this notion in trusting federal workloads to a commercial cloud provider," said Holgate.
Tackling commodity IT first, such as file services and software like SharePoint, enabled a smoother migration of larger data troves for the NSF. The same is true for the National Archives and Records Administration (NARA).
NARA CIO Swarnali Haldar told CIO Dive that next summer the Electronic Records Archive 2.0 (ERA 2.0) system "will house all current and future electronic Federal records when fully operational" and will be using a "technology stack of open-source software." NARA already has its National Archives Catalog in the cloud, which eases record searches for users.
The International Trade Administration, FCC, Small Business Administration and the General Services Administration are also aggressively adopting cloud-based solutions.
Still, modernization is an evolution, and Aronson, like most CIOs, is aware of this.
"We're just going to continue to evolve, so that we eventually have almost nothing in our onsite data center," said Aronson. As of now, it's still just a matter of agencies committing to an overall migration.
Follow Samantha Ann Schwartz on Twitter