Data Centre options for Pipi 9

Mike's Notes

I'm writing down my evolving thoughts about a future Pipi 9 data centre. This is to enable suppliers, Alex, and others to give me feedback.

Resources

Repository

  • Home > Ajabbi Research > Library > 
  • Home > Handbook > Company > Values

Last Updated

27/04/2025

Data Centre options for Pipi 9

By: Mike Peters
On a Sandy Beach: 26/04/2025

Mike is the inventor and architect of Pipi and the founder of Ajabbi.

Pipi 4 in 2007

Pipi 4 used three ModemPak 45U rack cabinets to mount the servers, with a separate 6 kW UPS. Here is a grainy photo from 2007, when it hosted the 17th most popular website in NZ, which received 30,000 unique visitors and 750,000 page views per month. It included servers for ESRI GIS, databases, applications, web hosting, updates, backups, security, rendering and more, with a lot of redundancy. It was never hacked, thanks to the platform team led by Dave Evans. A report at the time to the Department of Conservation (DOC), written by Turing Solutions, a New Zealand government contractor, stated that Pipi 4 was worth NZ$3 million.

The first Pipi 9 rack cabinet

I am building out the first 45U rack cabinet. It was previously used for Pipi 4 and has been in storage for 10 years. I'm having to replace the keyed barrel locks and get some Rackstuds.

This temporary rack will be equipped with older servers for testing Pipi 9 and to start rendering open-source modular applications to send to GitHub. Older servers work fine, but the rendering takes longer.

I will then replace these servers over the coming months with better second-hand ones to speed up the rendering.

I'm really enjoying this job. If you've ever had Meccano, you'll understand. I also have to design and build some specialist electronic gear to put in the rack. Even more fun.

It's also a significant milestone in the Pipi 9 build. This setup will be sufficient until Ajabbi scales.

The first Data Centre

Once scaling begins, Pipi 9 will need a data centre to use as a render farm to automatically create customised SaaS enterprise applications based on user requirements. The data centre will be completely isolated from the internet to maximise security, and it can be expanded in stages if planned appropriately.

Each industry and each enterprise customer will get a dedicated server to store a mirrored copy of their deployment configuration and parameters, including localisation. No user data will be stored.

This enables customised updates to be created, for example by importing the latest version of SNOMED CT or the IATA codes.
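
As a rough illustration, here is a minimal sketch of what one customer's mirrored configuration record could look like. The CustomerConfig class, its field names, and the example values are all hypothetical, not the actual Pipi 9 schema; the point is that only deployment configuration, parameters, and localisation are held, never user data.

    from dataclasses import dataclass, field

    @dataclass
    class CustomerConfig:
        """Hypothetical per-customer record held on a dedicated server.

        Only deployment configuration and parameters are mirrored here;
        no user data is ever stored.
        """
        customer_id: str
        industry: str                          # e.g. "healthcare", "aviation"
        locale: str                            # localisation, e.g. "en-NZ"
        modules: list = field(default_factory=list)
        # Versions of external reference data bundled into customised updates,
        # e.g. a SNOMED CT release for healthcare or IATA codes for aviation.
        reference_data: dict = field(default_factory=dict)

    # Example record Pipi 9 might consult when rendering a customised update.
    example = CustomerConfig(
        customer_id="acme-health",
        industry="healthcare",
        locale="en-NZ",
        modules=["scheduling", "billing"],
        reference_data={"SNOMED CT": "2025-04-01"},
    )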

Colocated Staging Data Centre

The built application then needs to be copied to disk and deployed to a small, caged, colocated data centre with high security, reliability, and connectivity as a staging point for worldwide deployment. One option could be to use the proposed DataGrid Data Centre, set to open in Southland, NZ, by 2028, with high-speed undersea cables.

Pipi 9 is designed to function as a swarm, with complete self-managing automation at every stage.

The emphasis is on 100% reliability and security, with lots of redundancy. See the Ajabbi Handbook reference for more information.

Production in the Cloud

The built application then needs to be deployed to a customer-chosen cloud provider so that the customer can use the enterprise application in production. Regular updates and system changes would be delivered by Pipi 9 in hours, not weeks or months.
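
Pulling the stages together, the path from the isolated render farm to a customer's cloud could be modelled as a simple ordered pipeline. The stage names and the promote function below are my own sketch of the sequence described above, not actual Pipi 9 code.

    from enum import Enum, auto

    class Stage(Enum):
        """Hypothetical stages on the path from render farm to production."""
        RENDER = auto()        # built in the isolated, offline render farm
        COPY_TO_DISK = auto()  # written to physical media; the farm stays offline
        STAGING = auto()       # loaded into the caged, colocated staging centre
        PRODUCTION = auto()    # deployed to the customer's chosen cloud provider

    def promote(stage: Stage) -> Stage:
        """Move a built application one step closer to production."""
        order = list(Stage)
        i = order.index(stage)
        if i + 1 >= len(order):
            raise ValueError("already in production")
        return order[i + 1]

    # A build works its way through every stage, in order.
    stage = Stage.RENDER
    while stage is not Stage.PRODUCTION:
        stage = promote(stage)
        print(stage.name)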

Data Centre equipment

I really like the wide range of cabinet options that Rittal provides. My preference would be to use a standard 800 mm wide 19-inch rack that could be flexibly used for the following (a rough split is sketched after the list):

  • Power
  • Cooling
  • Servers
  • Networking
  • Rack automation and monitoring
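
Here is a rough planning sketch of how one cabinet (say 45U, like the old ModemPak ones) might be split across those roles. The unit counts are placeholder guesses for discussion, not a finalised layout.

    # Hypothetical split of a 45U cabinet across the roles listed above.
    # The unit counts are placeholders for planning, not a finalised layout.
    RACK_PLAN = {
        "Power (UPS, PDUs)": 6,
        "Cooling": 3,
        "Servers": 28,
        "Networking": 4,
        "Rack automation and monitoring": 2,
    }

    TOTAL_U = 45
    used = sum(RACK_PLAN.values())
    print(f"{used}U allocated, {TOTAL_U - used}U spare")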

Other issues:

  • Determine the correct cabinet model to use
  • Use Open19 rather than OCP
  • Use a cold aisle
  • Security doors
  • 24-hour monitored access

Looking good

I also like the look of the Eaton racks, probably because I like black racks. But maybe Rittal could provide them in black as well, on special order from their manufacturing facility across the ditch in Australia. Or else I could have them spray-painted and paint-oven-baked locally. The Data Centre needs to be cool and look cool, and I don't like messy network cabling. Hey, I'm an artist too.

No doubt these ideas will evolve.
