The microservice architecture wasn't going to work: the simulator induced exponential complexity, and that meant a serious scaling problem. I had a good think and decided to dive into the deep end. Instead of trying to reduce complexity, why not maximise it and then harness it? Just like in nature.
The key to the puzzle was introducing global namespaces and changing the architecture so that all messaging could be 100% reliable.
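One way to read "global namespaces plus 100% reliable messaging" is at-least-once delivery made effectively exactly-once by deduplicating on globally namespaced message IDs. The post doesn't show Pipi's internals, so this is a minimal sketch under that assumption; the names (`ReliableBus`, `publish`, `deliver`) are illustrative, not Pipi's API.

```python
import uuid

class ReliableBus:
    """Sketch: retries are safe because every message carries a
    globally unique, namespaced ID that receivers deduplicate on."""

    def __init__(self):
        self.seen = set()      # message IDs already processed
        self.handlers = {}     # topic -> handler function

    def subscribe(self, topic, handler):
        self.handlers[topic] = handler

    def publish(self, topic, payload, msg_id=None):
        # A global namespace for IDs lets any node recognise retries.
        msg_id = msg_id or f"{topic}:{uuid.uuid4()}"
        return self.deliver(topic, payload, msg_id)

    def deliver(self, topic, payload, msg_id):
        if msg_id in self.seen:    # duplicate retry: drop silently
            return False
        self.seen.add(msg_id)
        self.handlers[topic](payload)
        return True
```

With this shape, a sender can retry a delivery as many times as it likes and the handler still runs exactly once, which is what makes the messaging reliable end to end.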
For security, I tried deeply nested microservices. So far so good.
Then I introduced algorithms and Markov chains, fuzzy logic and noise injection.
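The post doesn't say how Markov chains and noise injection were combined, but a common pattern is to follow a weighted transition table most of the time and, with some small probability, jump to a random state instead. A minimal sketch under that assumption (the function name and the `noise` default are mine, not Pipi's):

```python
import random

def noisy_markov_step(state, transitions, noise=0.1, rng=random):
    """One step of a Markov chain with noise injection: with
    probability `noise`, ignore the transition table and jump to a
    uniformly random state; otherwise sample the next state from the
    weighted transitions of the current state."""
    states = list(transitions)
    if rng.random() < noise:
        return rng.choice(states)                  # injected noise
    nxt, weights = zip(*transitions[state].items())
    return rng.choices(nxt, weights=weights)[0]    # normal chain step
```

For example, with `transitions = {"a": {"b": 1.0}, "b": {"a": 1.0}}` and `noise=0.0` the walk alternates deterministically, while raising `noise` makes the behaviour progressively less predictable.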
PostgreSQL became the production database. Thousands of parameters were needed.
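Thousands of parameters suggest a namespaced parameter table rather than hard-coded config. A sketch of that idea is below; the production system used PostgreSQL, but sqlite3 stands in here only so the example is self-contained, and the table and column names are illustrative.

```python
import sqlite3

# Stand-in for the production PostgreSQL database.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE parameters (
        namespace TEXT NOT NULL,   -- e.g. 'sim.noise'
        name      TEXT NOT NULL,   -- e.g. 'amplitude'
        value     REAL NOT NULL,
        PRIMARY KEY (namespace, name)
    )
""")
conn.execute("INSERT INTO parameters VALUES ('sim.noise', 'amplitude', 0.05)")

def get_param(ns, name):
    """Look up one parameter; returns None if it isn't set."""
    row = conn.execute(
        "SELECT value FROM parameters WHERE namespace = ? AND name = ?",
        (ns, name)).fetchone()
    return row[0] if row else None
```

Keying on `(namespace, name)` keeps thousands of parameters queryable and lets different subsystems own their own namespaces without clashing.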
In all this, the self-documentation broke, so I was working completely blind. Pipi was headless, and there was still no API, so I only knew it was working by looking at the logs.
Rich primitives based on the objective reality of emergent layers replaced some entities. Each was a software program in its own right, ultimately derived from the basic laws of physics and replicable science.
I finished up before Christmas.
Wikipedia became my friend. I had to take a deep dive into algorithms, and it is amazing what you can find by googling. Most days I watched a conference talk from QCon.
But Wait, There's More
It worked but no one could use it. The headless state of Pipi needed fixing.