
Designing Data Centers in the Age of AI: Power. Proximity. Real Places.




I’ve been thinking a lot about how the data center conversation has shifted over the last few years. Not just in terms of scale, but in terms of pressure. Everything feels tighter now. Timelines, power, approvals, expectations. AI did not just increase demand. It exposed how brittle some of our old assumptions really were. What we are dealing with today is not just a faster version of the same work. It is a different kind of work.



AI Turns Infrastructure into the Main Event


AI is often framed as a software or hardware problem, but in practice, it shows up first as an infrastructure problem. Power density, cooling strategies, network paths, physical layout. These decisions are no longer secondary. They define whether a project works at all. What I find interesting is that while training workloads still give us some geographic flexibility, inference does not. The moment latency matters, everything changes. Where you build matters. How traffic moves matters. What used to be acceptable network behavior suddenly is not. That shift is still underappreciated in a lot of planning conversations.
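To make the density point concrete, here is a rough back-of-envelope comparison. The rack figures below are illustrative assumptions, not numbers from any specific project, but they show why the same floor plate can suddenly demand an order of magnitude more power:

```python
# Illustrative back-of-envelope: how AI rack densities change the power picture.
# All figures are assumptions for illustration, not project data.

legacy_rack_kw = 8    # assumed typical enterprise rack
ai_rack_kw = 80       # assumed high-density AI training rack
racks = 500           # hypothetical data hall

legacy_load_mw = legacy_rack_kw * racks / 1000
ai_load_mw = ai_rack_kw * racks / 1000

print(f"Legacy hall: {legacy_load_mw:.1f} MW")  # 4.0 MW
print(f"AI hall:     {ai_load_mw:.1f} MW")      # 40.0 MW
```

Same footprint, ten times the load. That is why the infrastructure decisions now come first.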



Generic White Space Is Harder to Justify Than We Admit


I have not personally had a project forced into a major redesign late in the process due to AI requirements, but I do see why teams are being far more cautious upfront. The risk of getting the design wrong is simply too high now. We used to lean on standardization as a safety net. Today, flexibility is the safety net. Customers want facilities that can evolve with chip cycles they themselves do not fully control. That means we are often designing for optionality rather than optimization, even if it costs more in the short term. It feels like a necessary trade.



Power Is Still the Hardest Problem, Especially Behind the Meter


Power remains the dominant constraint in almost every serious project discussion I am involved in. Not just access to power, but how it is delivered, how it is staged, and who owns the risk. On a previous project, behind-the-meter power generation became a real issue. It was not theoretical. It affected schedules, coordination, and ultimately design decisions. When you are forced to solve power problems on site, you quickly realize how interconnected everything becomes. Mechanical layout, permitting, cost, community perception. Nothing stays isolated. Interestingly, I have not seen a project outright fail due to transmission constraints, but I have seen enough close calls to know that it is always lurking in the background.



Cooling and Density Are No Longer Engineering Footnotes


AI has pulled cooling and density decisions out of the back room and into the main design narrative. Even when liquid cooling is not required on day one, it is shaping the building. Floor loads, ceiling heights, mechanical yards, future piping routes. We are designing for systems that may not be installed yet, because nobody wants to retrofit a building that was never meant to support what comes next. There is a certain humility that comes with this. Accepting that whatever we design today is probably not the final answer, and that resilience comes from planning for change, not resisting it.
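For readers who like to see the arithmetic, the physics behind the liquid cooling conversation is straightforward: essentially all of the electrical power a rack draws leaves as heat, and the coolant has to carry it away. A minimal sketch, assuming water as the coolant, with the rack power and temperature rise chosen purely for illustration:

```python
# Minimal sketch of the heat-rejection arithmetic behind liquid cooling,
# using the standard relation Q = m_dot * c_p * delta_T for water.
# Rack power and temperature rise below are assumed for illustration.

C_P_WATER = 4186.0   # specific heat of water, J/(kg*K)
RHO_WATER = 1000.0   # density of water, kg/m^3

def coolant_flow_lps(rack_power_kw: float, delta_t_k: float) -> float:
    """Liters per second of water needed to carry away rack_power_kw
    with a coolant temperature rise of delta_t_k."""
    mass_flow = (rack_power_kw * 1000.0) / (C_P_WATER * delta_t_k)  # kg/s
    return mass_flow / RHO_WATER * 1000.0  # convert m^3/s to L/s

# A hypothetical 100 kW AI rack with a 10 K coolant temperature rise:
print(f"{coolant_flow_lps(100, 10):.2f} L/s")  # ~2.39 L/s per rack
```

Multiply that flow by hundreds of racks and you see why pipe routing, floor loads, and mechanical yards have moved to the center of the design narrative.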



Inference Is Quietly Redrawing the Network Map


This is where I think the industry is still catching up. For decades, the model was simple. Centralize. Backhaul. Build bigger hubs. That model starts to crack when inference enters the picture. Real-time workloads do not tolerate unnecessary distance, even if the facility on the other end is cheaper or more established. What I am seeing is a renewed focus on proximity. Smaller interconnection points. More intentional fiber routing. Less assumption that traffic will naturally flow back to a handful of major metros. In many cases, the interconnection becomes the anchor, and the data centers follow. That is a meaningful inversion of how a lot of us learned this business.
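The distance constraint is easy to quantify. Light in optical fiber travels at roughly 200,000 km/s (about two-thirds of its speed in a vacuum), so every kilometer of route adds about ten microseconds of round-trip time before any switching or queuing delay. A small sketch, with hypothetical route lengths:

```python
# Rough sketch of why distance caps real-time inference: propagation in
# fiber is about 200,000 km/s, so ~10 microseconds of round-trip time
# accrues per km of route. Route lengths below are hypothetical.

SPEED_IN_FIBER_KM_S = 200_000  # approximate propagation speed in glass

def fiber_rtt_ms(route_km: float) -> float:
    """Best-case round-trip propagation delay over a fiber route, in ms."""
    return 2 * route_km / SPEED_IN_FIBER_KM_S * 1000

for km in (50, 500, 4000):
    print(f"{km:>5} km route -> {fiber_rtt_ms(km):.2f} ms RTT (propagation only)")
# 50 km ~0.50 ms, 500 km ~5.00 ms, 4000 km ~40.00 ms
```

Propagation is only the floor; real paths add routing, queuing, and protocol overhead on top. That is why backhauling real-time inference to a distant hub stops being an option, no matter how cheap the far-away facility is.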



Smaller and Incremental Builds Are Not Just Safer, They Are Smarter


We recently completed a data center project for a major urban campus. While I can’t share specific details out of respect for confidentiality, it became clear early in the process that a centralized, on-campus approach was both timely and well aligned with how the campus is expected to operate in the years ahead. By rethinking where its physical infrastructure lives and rebuilding it in a modern setting, the institution is better positioned to adopt emerging technologies that improve latency, strengthen the protection of sensitive research, and bring greater clarity to long‑term capital investment planning.


From an architectural and planning perspective, the solution functioned as a kind of “micro‑hub.” Sensitive intellectual property remains fully within the institution’s physical and jurisdictional control, effectively shifting it from being a tenant of distant technology platforms to the steward of its own digital footprint. That distinction has become increasingly important as data, research, and compute grow more tightly coupled to institutional mission.


This experience reinforced a broader pattern I’m seeing across the industry: smaller, incremental builds are not just safer, they’re smarter. Large campuses will always have their place, but modular, right‑sized deployments move faster, scale with actual demand, and are easier to align with the needs and concerns of the communities they serve. Starting small isn’t merely an approval strategy. It’s a design strategy that buys time, builds trust, and reduces the risk of getting too far ahead of demonstrated need.



Community Resistance Is a Design Constraint, Not a Side Issue


I have been in contentious community meetings before, and not just for data centers. One was for a student housing project. Another was for a more recent data center project. Different asset classes, same underlying dynamic. People do not like surprises. They do not like feeling excluded from decisions that affect their environment. What I have noticed, though, is that developers are learning. Slowly, but genuinely, communication is improving. The industry is getting better at explaining what is being built, why it matters, and how it fits into a longer-term vision for the community. That learning curve took longer than it should have, but it is real. From a design perspective, this means scale, phasing, water use, noise, and even aesthetics matter more than they used to. The physical form of the building becomes part of the conversation.



Risk, Resilience, and the Shape of Buildings


Another subtle shift is how risk management and insurance influence design. As projects grow in size and complexity, redundancy, separation, and compartmentalization start to shape campus layouts in very real ways. This is not just about code compliance. It is about making projects insurable, financeable, and operable over decades. That pressure shows up in spacing, system segregation, and how much infrastructure you concentrate in a single structure. It all feeds back into design.



Designing for Long Lives in a Short-Cycle World


What keeps me optimistic is that some parts of this industry are still incredibly durable. Real estate in the right location. Fiber routes. Interconnection. These elements outlast chip generations, software platforms, and market hype. Machines will keep changing, but the places where they connect will not change nearly as fast. The best projects I see today are the ones that accept that mismatch. They focus on flexibility, resilience, and long-term relevance rather than chasing a perfect snapshot in time.



Final Thought


AI has not just raised demand. It has raised expectations. We are being asked to move faster, design smarter, and engage more thoughtfully with power providers, communities, and customers. That is uncomfortable at times, but it is also forcing the industry to mature. The future of data center design is not just about scale. It is about alignment. With infrastructure realities, with local context, and with the fact that these buildings are meant to last far longer than the technology they house. That feels like a challenge worth leaning into.



About the Author


Frederick is an award‑winning architect and planner with over 35 years of architectural design experience and nearly 20 years in the data center sector, bringing an international portfolio that spans large‑scale facilities and diverse project types. His background across global markets gives him a broad, practical lens that shapes the insights he shares.


Throughout this series, he closely follows the evolution of the data center market, drawing on industry events, newsletters, and ongoing dialogue with fellow experts to distill what’s worth paying attention to and why it matters.
