Pipi 9 (2023-)

Mikes Notes

By the end of last year (2022), Pipi 8 was ready for another architectural transformation. The successful adoption of global namespaces meant that any internal structure became possible.

Large groups of internal engines were then aggregated into complex systems, some containing up to 20 engines, all operating independently and interacting. These systems were then combined into still larger systems, and so on. So far there are 300+ engines, and from this composition Pipi 9 has emerged.
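The aggregation described above, engines composed into systems, which are themselves composed into larger systems, can be sketched as a simple composite structure. This is a minimal illustration, not Pipi's actual internals; all names here are hypothetical.

```python
# Hypothetical sketch of engines aggregated into nested systems.
# "Engine" and "System" are illustrative names, not Pipi's real classes.

class Engine:
    """A single engine, operating independently."""
    def __init__(self, name):
        self.name = name

    def count_engines(self):
        return 1

class System:
    """A system composed of engines and/or smaller systems."""
    def __init__(self, name, parts):
        self.name = name
        self.parts = parts  # each part is an Engine or a nested System

    def count_engines(self):
        return sum(part.count_engines() for part in self.parts)

# A subsystem of 20 engines, nested inside a larger system.
subsystem = System("entity", [Engine(f"engine-{i}") for i in range(20)])
platform = System("pipi", [subsystem, Engine("scheduler")])
print(platform.count_engines())  # 21
```

Counting engines recursively like this is how a 300+ engine total falls out of many smaller, independently operating systems.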


Pipi was still headless, and the self-documentation engine had broken when Pipi 8 was built out of Pipi 7. The only way I could test Pipi 8 was with batch files and by reading the logs.

It was time to fix the system's self-documentation. The CMS was rebuilt, and templates were added to automatically generate a series of linked Wiki articles (like Wikipedia) and a set of classically framed "PipiDocs" (a bit like JavaDocs).

The next stage enables direct editing of the parameters of the underlying systems that the Wiki and Docs pages then self-document. A reverse loop of sorts.

1200 static Wiki web pages were rendered in 7 seconds. I was surprised by how fast it was. The Wiki and Docs are also cross-linked.  
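A minimal sketch of the kind of template-driven static generation described above: render a Wiki page and a Docs page per engine from metadata, cross-linking the two. The metadata fields, templates and file layout here are assumptions for illustration only.

```python
# Hypothetical sketch: render cross-linked static Wiki and Docs pages
# from engine metadata. Field names and templates are illustrative.
import string

WIKI_TEMPLATE = string.Template(
    "<h1>$name</h1><p>$summary</p>"
    '<a href="docs/$name.html">PipiDocs entry</a>'
)
DOCS_TEMPLATE = string.Template(
    "<h1>$name API</h1>" '<a href="wiki/$name.html">Wiki article</a>'
)

# Stand-in metadata for 1200 engines/components.
engines = [{"name": f"engine{i}", "summary": "..."} for i in range(1200)]

pages = {}
for e in engines:
    pages[f"wiki/{e['name']}.html"] = WIKI_TEMPLATE.substitute(e)
    pages[f"docs/{e['name']}.html"] = DOCS_TEMPLATE.substitute(name=e["name"])

print(len(pages), "pages rendered")  # 2400 pages rendered
```

Because every page is pure string substitution over pre-collected metadata, rendering thousands of static pages in seconds is plausible.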

My friend Alex Shkotin from the Ontolog Forum helped me develop good self-documentation templates.


Next came putting a User Interface (UI) on the front end. The problem to solve was creating a UI generator based on a User Interface Description Language (UIDL).

The CAMELEON Reference Framework (CRF) was chosen because of its rich history in EU-sponsored research, much of it led by Professor Jean Vanderdonckt in Belgium. I specifically used UsiXML but intend to borrow good ideas from XUL, XAML, WasabiXML, etc.

CRF has three layers: Abstract UI, Concrete UI and Final UI. I added two more: a Locale UI and a Personalisation UI. CRF was originally designed to run both ways; I made it generative only, driven by DDD workflows.

Now the UI generator pipeline is:
DDD > Wfl > AUI > CUI > FUI > LUI > PUI
It sounds complicated, but it is robust, efficient and only needs to run occasionally.
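The one-way pipeline above can be sketched as a chain of model-to-model transformations. The stage functions below are placeholders standing in for the real DDD-to-workflow and CRF layer mappings; every name and field is an assumption for illustration.

```python
# Illustrative sketch of the generative pipeline
# DDD > Wfl > AUI > CUI > FUI > LUI > PUI.
# Each stage is a placeholder transformation, not Pipi's real one.

def compose(*stages):
    """Chain transformations left to right into one generator."""
    def run(model):
        for stage in stages:
            model = stage(model)
        return model
    return run

def to_workflow(ddd):  return {"workflow": ddd["tasks"]}
def to_abstract(wfl):  return {"aui": [{"task": t} for t in wfl["workflow"]]}
def to_concrete(aui):  return {"cui": [{"widget": "form", **c} for c in aui["aui"]]}
def to_final(cui):     return {"fui": cui["cui"]}          # e.g. rendered widgets
def to_locale(fui):    return {**fui, "locale": "en"}      # language and script
def to_personal(lui):  return {**lui, "theme": "default"}  # user preferences

generate_ui = compose(to_workflow, to_abstract, to_concrete,
                      to_final, to_locale, to_personal)

ui = generate_ui({"tasks": ["create invoice", "send invoice"]})
print(ui["locale"], len(ui["fui"]))  # en 2
```

Making the chain generative-only keeps each stage a pure function of the previous model, which is what makes the whole pipeline cheap to re-run when the domain model changes.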

Testing of the UI is underway with volunteers who use Braille devices for the deaf-blind and screen readers, and with volunteers affected by Irlen syndrome, colour blindness, dyslexia, epilepsy, etc.


  • A Complex Adaptive System emerges, capable of learning, evolving and replicating
  • Primitives also linked to the layer just below any top ontology
  • Power-law behaviours and fractal patterns observed in logs
  • BORO in reverse feeds ontologies to the entity engine
  • Top-level ontologies become subjective world-views
  • UIDL (CAMELEON) rendered UI with workflows, accessibility and i18n
  • Design tokens
  • Self-documentation fixed
  • 4Dism used for space-time
  • CMS rebuilt to enable user customisation
  • Workflow patterns
  • Generic digital twin
  • Rules as code
  • The server is "trained" to produce SaaS applications
  • Optimised for DevOps teams and Mission Control
  • Automated self-tests using feature flags and canaries
  • Embedded databases
  • All human languages (ISO 639-3) and writing scripts (Unicode) can now be used by the UI
  • Testing by volunteers begins
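The "automated self-tests using feature flags and canaries" item above could look something like the following: each flagged feature has a canary check, and a failing canary flips the flag off automatically. This is a minimal sketch under assumed names, not Pipi's actual mechanism.

```python
# Hypothetical sketch: a feature stays enabled only while its canary
# self-test keeps passing; a failure triggers automatic rollback.

flags = {"new_renderer": True}

def canary_new_renderer():
    # Stand-in for a real smoke test of the new rendering path.
    rendered = "<h1>ok</h1>"
    return rendered.startswith("<h1>")

CANARIES = {"new_renderer": canary_new_renderer}

def run_self_tests():
    for flag, canary in CANARIES.items():
        if flags[flag] and not canary():
            flags[flag] = False  # automatic rollback on canary failure
    return flags

print(run_self_tests())  # {'new_renderer': True}
```

Tying rollback to the flag rather than to a redeploy is what makes this pattern cheap enough to run continuously.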
