
Overbyte’s Walker agrees, saying that almost all the snafus his team ran into during the online school deconstruction project were not technical, but came down to visibility. “At some point, you have to confront unknown systems; things with incomplete or outdated documentation,” he says. “We had moments where, after beginning to deprovision systems, stakeholders surfaced saying, ‘Wait, that’s still in use.’”
Dismantling systems is not the end
PPG experienced no disruptions during the dismantling process, Ramachandran says, other than some tactical delays and contracts that needed updating.
“There were some learnings on the network side because networking can get complex,” he says. “Sometimes, we extended the outage windows” to up to five hours, for example. “Those were the hiccups.”
From start to finish, the decommissioning of all eight data centers took about three years. “The end is not migrating all the workloads. The end is actually shutting down the data center,” Ramachandran stresses. This requires deconstructing the power, cooling, and fire systems, as well as the multiple backup generators, which had to be removed by helicopter.
“You have to take the diesel fuel out and dispose of it or sell it. We have to get recertification of the building for safety, because this is a building where you had kilowatts of power coming in, which basically [also] went through a deconstruction process,” he says. “So you have to get a safety certification … all of this takes time because we have to give the building back to the building management the way they gave it to us.”
What data center deconstruction buys you
The painstaking data center deconstruction process has given Ramachandran valuable insight. “Make sure your best people spend time creating value for the business, as opposed to babysitting infrastructure,” he says, because running infrastructure no longer adds value in itself.
“You also do a lot of inherent risk management by getting rid of data centers and moving to a cloud environment you don’t have to worry about,” he adds. Noting the current state of the economy, Ramachandran says coping with sudden price increases for memory and chips is no longer stressful since the company is no longer buying infrastructure.
“You’re basically giving back working capital to the company, because you’re moving the organization from a fixed capital environment to your variable cost model completely,” he says, “and you don’t have to refresh your hardware every four or five years.”
Cost was never the objective for the data center deconstruction, Ramachandran notes. “Nonetheless, when we did the business case, we said it’s not going to cost us any more or any less, but will buy us better security, better flexibility, better agility for the organization,” as well as better focus and technology. “And we achieved all of those.”
The value is in all those other areas. “We are not data center operators. The team is now focused on delivering applications that are meaningful to the business,” Ramachandran says. “The team is much closer than ever to the business because we are not talking infrastructure but how to make the business better.”
Walker says companies should measure twice, cut once. “Most teams want to jump straight into migration,” he says, “but the real work is building a complete inventory and mapping dependencies upfront.”
While it made sense for PPG to modernize some apps at the same time as the data center deconstruction work, Walker advises IT leaders to resist the urge to do everything at once. “Focus on moving what you understand first, and isolate the unknowns early,” he says.
“The success of these projects is usually determined by how well you handle the edge cases, not the easy wins.”
Any new technological development IT can make without interrupting operations dramatically reduces time to market, Ramachandran says.
Working on the latest technologies makes IT happy, and that helps with talent retention, he adds, “because we can say we’re cloud only, so this 143-year-old company looks modern. That is meaningful in so many ways.”

