The AmeriPride Services data center in Minnetonka, MN is a showplace, a model for groups planning data center upgrades of their own. It wasn’t always that way. “We knew how to run a data center but not how to upgrade one,” says Data Center Manager Jeff Baken. “Our goal was to create an enterprise-class data processing center without interrupting ongoing operations, and we were looking for a reliable local partner with good relationships with the vendors who would be providing systems and services for the new center. 2NSystems came highly recommended and was prepared to oversee the entire process. They provided a project management team, coordinated all the players and all aspects of the process, and brought the project in on time and within budget.”

Project Included

  • 80 kW data center with N+1 power and cooling
  • 80 kW APC UPS
  • Four APC in-row cooling units to provide redundant cooling for 90 kW of load (see the sizing note after this list)
  • APC rack
  • APC rack power distribution
  • DCIM monitoring software from APC
  • Electrical and mechanical installation
  • Fire suppression upgrades
  • Wall system to separate data center from staging area
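A note on the cooling arithmetic (an inference from the figures above, not stated explicitly in the case study): in an N+1 arrangement, three of the four in-row units carry the full design load at roughly 30 kW each (90 kW ÷ 3), while the fourth stands in reserve, so any single unit can fail or be taken offline for service without reducing available cooling capacity.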

THE CHALLENGE

“Our existing data center was out of power and out of space, our systems were outdated, cooling was insufficient, and we were averaging two outages a day,” says Baken. “We knew we had to make a change, but the only place to put a new system was where the old one already was. The challenge was having to replace everything while the system was still working.”

THE SOLUTION

“A lot of effort and planning went into the change,” says Baken. “2NSystems worked closely with us and with the vendors to phase in the changes without impacting operations. We went from standalone servers to rack-mounted blade servers and installed hot aisle containment cooling. With careful planning we were able to schedule moves from old servers to new over weekends and evenings.”

THE RESULTS

“The system became more stable right from the start of the process, and we now have zero downtime,” says Baken. “By increasing cooling efficiency we’re saving money and energy. And we’ve reduced the system footprint so much that we’ve put up a glass wall and converted part of the space the old system occupied into a meeting room/work area for technical staff.”