P3.0 Analysis

Elegant Reasonism Holistic Analysis

Once specific insights have been developed from the previous two process steps, investigators will find they have collected considerable insight; however, when we step back from that body of work and consider the Translation Matrices holistically, our experience suggests there are insights visible at this level that are available in no other manner. It was this level of analysis from our original systems review that directly led to The Emergence Model and unification. Every time we tested net new science against this approach it dovetailed with existing Elegant Reasonism investigations made manifest by M5.

P3.000 Holistic Analysis

The objective of the analysis phase is to build out evidence chains down through the analytical stack below, chains which successfully navigate the Process Decision Checkpoint Flowchart from upper left to lower right, all holistically in support of a Treatise that reflects the unified Universe.

Elegant Reasonism Generalized Process Flow

Elegant Reasonism analysis in reality occurs in every phase of the process, including Recognition and Illumination, but here the earlier phases are reviewed and then specifically approached from a holistic analytical perspective. The intent is to review the assumptions of each Encapsulated Interpretive Model (EIM) for context alignment and affinity, and then to review the entire body of work holistically in preparation for development of the treatise. Ultimately this entire effort culminates in the final Elegant Reasonism based Treatise.

Process Decision Checkpoint Flowchart

P3.010 Plurality of EIMs in Juxtaposition

The word ‘layer’ implies an epicenter, not a 2D construct. The point is that some of these epicenters span the matrices and others do not. Some are vertical, some horizontal, and others follow patterns for recognition purposes across the plurality of Encapsulated Interpretive Models (EIMs). Each ‘layer’ has an intended purpose or mission to fulfill in the grander mission of the particular investigation underway following Elegant Reasonism Rules. Investigators are reminded that they are analyzing the intersecting cells of the 2D Articulation Layer in juxtaposition, EIM to EIM, relative to and respective of Paradigms of Interest/Nature (POI/N). By this time metrics have been mode shifted and are also available for analysis.

Getting past status quo in order to develop significant insights is no trivial pursuit. Richard Feynman gave his lecture on Knowing vs Understanding in 1950, over 40 years before the International Council on Systems Engineering (INCOSE) was even formed. Elegant Reasonism was designed to help science penetrate these types of issues. We believe that had Feynman, Einstein, and many others on our Acknowledgements page understood modern information sciences and Langer Epistemology Errors, Elegant Reasonism would have been developed decades earlier than it was.



Translation Matrices Analysis Layers

Analysis functions are ‘layered’ against the 2D articulation frame formed by the encapsulated, enumerated, interpretive models of the Universe juxtaposed against the paradigms of interest (or of nature). Starting at the top, the analysis layers should include, but are not limited to, those shown above.
It should be noted that each layer is essentially a composite structure, with the above labels naming the predominant preoccupation of each relative and respective layer. Some layers are vertically occupied, others horizontal (e.g. spanning interpretive models), and still others integrate their purpose holistically. The basic purpose of these various layers is to assure the integrity of a particular investigation through analytical rigor, discipline, and review of every facet represented.
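The grid-plus-layers arrangement described above can be sketched in code. This is a minimal illustration only; the EIM labels and paradigms below are hypothetical placeholders standing in for whatever a real investigation enumerates in its Translation Matrices.

```python
# A minimal sketch of the 2D articulation frame: rows are Paradigms of
# Interest/Nature (POI/N), columns are Encapsulated Interpretive Models (EIMs).
# The names used here are illustrative placeholders only.

eims = ["M1", "M2", "M5"]                # hypothetical enumerated EIMs
paradigms = ["time", "mass", "gravity"]  # hypothetical POI/N entries

# Each intersecting cell holds the articulation of one paradigm under one EIM.
matrix = {p: {e: f"articulation of {p} under {e}" for e in eims}
          for p in paradigms}

# An analysis 'layer' is then a function over the cells. A vertical layer
# inspects a single EIM column; a horizontal layer spans EIMs for one paradigm.
def vertical_layer(eim):
    return [matrix[p][eim] for p in paradigms]

def horizontal_layer(paradigm):
    return [matrix[paradigm][e] for e in eims]

print(vertical_layer("M5"))      # one EIM across all paradigms
print(horizontal_layer("mass"))  # one paradigm juxtaposed EIM to EIM
```

Holistic layers would then operate over the entire `matrix` at once rather than over a single row or column.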

P3.020 Quality Management Systems (QMS) Standards

P3.030 Six Sigma

Traditional Six Sigma will not function here for several reasons. One is Encapsulation, because it creates an impenetrable barrier EIM to EIM; Sigma Defects are therefore not generally visible across all EIMs, and some are only visible to the holistic analysis layers. We must then take into consideration how Langer Epistemology Errors (LEEs) have impacted all of the various analytical areas of the investigation. LEEs, for example, constitute Sigma Defects, as do logic traps, concept compression, etc.

P3.031 EIM Sigma

EIM Sigma values represent the internal integrity of each individual EIM. This value is unique to each EIM.

P3.032 Analytical Layering Sigma

This sigma value is unique to each analytical layer within the Translation Matrices as employed by a given investigation team.

P3.033 Holistic Sigma (relative to the unified Universe)

The holistic sigma value can be calculated individually or taken as a ratio of the other sigma values across the investigation, as determined appropriate by the investigative team.
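As one hedged illustration of how a team might compute these values: the sketch below uses the conventional Six Sigma DPMO-to-sigma-level conversion (with the customary 1.5 sigma long-term shift) for the EIM and layer values (P3.031, P3.032), then takes the holistic value as a simple ratio (P3.033). The defect and opportunity counts are invented placeholders; a real investigation would define its own defect taxonomy (LEEs, logic traps, concept compression, etc.) and its own opportunity counts.

```python
from statistics import NormalDist

def sigma_level(defects, opportunities, shift=1.5):
    """Conventional Six Sigma conversion from a defect rate to a sigma
    level, including the customary 1.5-sigma long-term shift."""
    dpmo = 1_000_000 * defects / opportunities
    yield_fraction = 1 - dpmo / 1_000_000
    return NormalDist().inv_cdf(yield_fraction) + shift

# Hypothetical defect counts (LEEs, logic traps, concept compression, ...)
eim_sigma = sigma_level(defects=7, opportunities=10_000)    # P3.031
layer_sigma = sigma_level(defects=3, opportunities=10_000)  # P3.032

# P3.033: one option the text names -- take the holistic value as a
# ratio of the other sigma values, as the team deems appropriate.
holistic_ratio = eim_sigma / layer_sigma

print(round(eim_sigma, 2), round(layer_sigma, 2), round(holistic_ratio, 2))
```

Whether the ratio, a direct holistic defect count, or some other aggregate is most meaningful remains a judgment call for the investigative team, as the text states.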

P3.040 Integrity Review

Looking then at the General Process Flowchart above this integrity review is a left to right and top to bottom comparison between where the investigation began and where it ended relative to and respective of objectives and goals originally stated.

P3.041 Langer Epistemology Errors (LEEs) Elimination (e.g. Delta from Left to Right of the Generalized Process Flow)

Identification of LEEs is one area of interest, but potentially more insightful are the implications of having made them in the first place. What areas were obfuscated and which were illuminated? What was missed, and what was gained?

P3.042 Incongruity Reconciliation

One example we have used before is BX442. Red and blue shifts are another. Extended evidence chain linkages, potentially through the action principle, such as Ludwig von Mises' treatise on economics entitled Human Action, might be another.

P3.043 Logic Artifact Reconciliation

Logic artifacts are incongruities which cannot be, or cannot easily be, rationalized by a given EIM. The inability to employ a common geometric basis point is one example. The inability to fully couple reference frames is another. Rationalizing the multiverse in the context of these previous examples is yet another. Rationalization paradoxes are further examples. For instance, when we asked why the Big Bang banged, we were told by members of the WMAP team, before they were disbanded, that it was caused by “quantum fluctuations in finite regions of space”. The problem with that answer is that the Big Bang was supposed to have created spacetime; if time did not yet exist, nothing could move in order to ‘fluctuate’. General obfuscations and elaborate rationalizations round out the list.

P3.044 Concept Compression Reconciliation

BX442 exemplifies concept compression issues because it is more developed than the time since the Big Bang would otherwise allow. Consequently, the associated concepts must be compressed into the available time in order to meet rationalization criteria; hence the name of these types of errors. The Emergence Model reconciled BX442 because the age of the unified Universe grows beyond any need for elaborate explanations, as does the size of the area inside our particle horizon. That said, there may yet be other issues which arise from a full and comprehensive systems review, which has not yet been completed.

P3.050 Systems Engineering Requirements & Insights

These are requirements and insights resulting from application of the Systems Engineering Body of Knowledge (SEBoK).

P3.060 Systems Recognition

Under The Emergence Model, if we take individual MBPs as a system (because they can entangle and suffer Severance), then the implication is that everything real is a system or a system of systems. This area identifies those systems and either describes their behavior in the Treatise or sets them up for further study and/or R&D.

P3.070 Integrating Layer Analysis

This area of the investigative analysis steps back from what has been accomplished so far and integrates it all. The cogent description of M5 is a direct example of being able to accomplish such integration. That single paragraph was created in exactly this manner.

P3.080 Standards Based Insights & Analysis

There are insights and analysis that will be delivered because standards based investigations were employed, insights that would not have been developed by other approaches. That said, it should be recognized that such endeavors are expensive and consume resources and human capital. Proper planning and prioritization are vital.

P3.090 Pattern Recognition & Analysis

We labeled the subtitle of our original systems review “Discerning Patterns of Earth’s Emergence” exactly because of the pattern recognition which emerged from that effort. Mode shifting patterns from one EIM to another means that patterns also change and relationships may differ. Those details will provide additional insights that may or may not be expected. The ability to measure something and implement metrics will intrigue investigators and pique curiosity for generations to come.

P3.100 Architectures of Mass Recognition & Insights

Architectures of mass represent probably the single largest computing opportunity humanity will ever have. The R&D here is extensive beyond current capabilities. Because of insights already developed we can share that technologies like quantum computing and artificial intelligence need significant overhaul and mode shifting to increase their effectiveness in such an endeavor. Those who have a vested interest in status quo would do well to review In Unification’s Wake, Part 05: Business Impact.

P3.110 Tool Results, Recognition, & Insights

This stage of analysis takes into consideration all of the insights gathered from all the various tools both internal and external which were employed by the investigative team and develops subsequent insights from that data and information.

P3.120 Macro Insights

Macro insights are those derived directly from the holistic view determined by the investigation. Our original systems review notes represent fodder for such analysis.

P3.130 Establish The Unified Perspective

At this point in the investigation, team members are likely giddy, if not gobsmacked, by the insights they have developed. What needs articulating is not nearly as challenging to recognize as the means to communicate it effectively beyond the investigative team. That is where the challenge will be found, exactly because all those other people were not there, nor part of those meetings and discussions. This is where all the data collected to employ Bayesian analytics will earn its keep.
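The role of that Bayesian data can be illustrated with a minimal update sketch. The prior and likelihood values below are hypothetical placeholders, not measurements; the point is only that repeated, independently collected evidence shifts a skeptical audience's posterior in a quantifiable way.

```python
# A minimal Bayesian update sketch. All probabilities here are invented
# for illustration; real values would come from the investigation's data.
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) via Bayes' theorem for a single piece of evidence."""
    numerator = p_e_given_h * prior
    evidence = numerator + p_e_given_not_h * (1 - prior)
    return numerator / evidence

# Start from a skeptical prior, then fold in several independent findings,
# each moderately more likely under the hypothesis than under its negation.
posterior = 0.10
for _ in range(5):
    posterior = bayes_update(posterior, p_e_given_h=0.8, p_e_given_not_h=0.3)

print(round(posterior, 3))
```

Communicating the chain of updates, rather than only the conclusion, is what lets readers who "were not there" retrace how the investigative team arrived at its confidence.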

#ElegantReasonism #EmergenceModel #Unification #P3.0Analysis #Analysis #EIM #PDCF