Paul S. Prueitt


Professor Prueitt has taught mathematics, physics, and computer science courses in the nation's community colleges, universities, and four-year colleges. He has served as Research Professor in Physics at Georgetown University and Research Professor of Computer Science at George Washington University. He has served as Associate Professor or Assistant Professor of Mathematics at HBCUs in Virginia, Tennessee, Alabama, and Georgia. Prueitt was co-director of an international research center at Georgetown University (1991-1994). He is an NSF reviewer and Principal Investigator. He served for over a decade as an independent consultant focused on information infrastructure, software platforms, and intelligence algorithms. He has consulted on national intelligence software platforms and continues this work under private contracts.

His post-Master's training in pure and applied mathematics focused on real analysis, topology, and numerical analysis. His PhD, earned in 1988 from The University of Texas at Arlington, used differential and difference equations as models of neural and immunological function. He has over forty-five publications in journals, books, and conference proceedings.

Motivated by a desire to understand the nature of the American educational crisis, he served for seven years at Historically Black Colleges and Universities and at an open-door minority-serving institution. He currently teaches mathematics learning support courses in Atlanta using a deep learning method. The method uses four steps to bring learning support students to a college-level understanding of mathematics, and is motivated by a study of behavioral neuroscience and of the properties of immune response mechanisms.




New Data Analytics

Published: January 20, 2015 • Service Technology Magazine Issue LXXXVIII

Introduction and Context

Human knowledge transfer commonly occurs on digital platforms within industry training programs, in schools, and in colleges. A necessary part of digital knowledge-transfer systems is the enumeration of categories, and thus the rise of categorical analytics [REF-2] as an important sub-discipline of data science. Since this sub-domain is not yet well known, we cite the entire abstract of Katzan's 2008 article:

"Categorical analytics is an admixture of computational methods with the express purpose of facilitating the multifaceted process of unstructured decision making. The complex subject is based on consensus theory and includes structured analytics, categories, entropy, and the combination of evidence. The methodology is applicable to a wide range of business, economic, social, political, and strategic decisions. The paper includes an election application to demonstrate the concepts."

Katzan's approach is based on probability and is covered more extensively in his book [REF-3]. Our approach in this article is consistent with Katzan's, but stresses a specific "stratified model" in the development of a set of substructural categories, and various means to identify meaningful compositions of substructural categories.

Our use of digital technology is advancing, and a moment's reflection shows a sustained academic pursuit of knowledge transfer methods and theories. New funding streams from federal sources have brought about interdisciplinary academic activity, and the problem to be addressed has been clear: organizations set about to capture, retain, and grow relevant human knowledge capital. However, finding an academic home for categorical analytics has been difficult. To understand why, we must look for root causes.

Within organizational sciences, two consistent drivers have been speed and complexity, both within the enterprise and in the global marketplace. On the speed side, appropriate knowledge transfer anticipates not only speed but also simplicity [REF-4]. One result has been constant pressure to encapsulate knowledge into bite-sized, memorable "nuggets." For example, a champion stock trader might give an apprentice a set of catchy phrases such as, "when volatility is high, we buy; when volatility is low, we go." The apprentice applies the rules and does rather well, that is, until either the rules no longer apply or external conditions change and he or she cannot decide which course of action to take.

In this illustration, the underlying sense that created the rules is missing. This missing sense should evoke a reaction. However, that reaction may itself be inhibited. To examine this further, we note that inhibition of what would otherwise be straightforward recall of knowledge often occurs due to activation of the human frontal lobe system [REF-5]. The question here is why transfer does not occur.

The inability to represent knowledge is a critical barrier to knowledge transfer, regardless of the setting. This type of failure in a knowledge transfer platform may be overcome when one adopts a stratified framework, similar in many respects to what occurs in Service Oriented Computing [REF-6]. Our approach suggests that stratification of elements is ubiquitous in nature [REF-7].

When we use principles found in stratification theory, as occurs implicitly in service-oriented design, we enable the production of computable representations of deep structure, and of the assembly mechanisms active in the various observed behavioral aspects of knowledge transfer, including inhibition of learning where that is a primary concern. Machine learning research recognizes the importance of architectural designs that identify and then use deep structure [REF-8].

A general criticism of statistical approaches, such as those of Bayes [REF-9], Dempster [REF-10], and Shafer [REF-11], which are foundational to Katzan's excellent introduction to categorical analytics, is well known: most current work lacks the underlying sense needed to deal with certain types of novelty. One may identify outliers, but one might argue that statistical analytics does not bring sufficient tools to identify causative elements when novel data is present. For the individual, if we lose our ability to recognize novel situations, we lose our ability to properly contextualize.

This issue, namely how the human brain system handles novelty, is central to our approach to categorical analytics based on non-statistical procedures. Of course, it should be stated that the two approaches, statistical and categorical, can and should be used together.

What is Deep Structure?

Digital platforms use deep structure in various ways. Knowledge representation in the form of computable ontology defines a representation of organizational deep structure. Categorical analytics is used to create taxonomies of terms and relationships. This information is then encoded as RDF triples, i.e., as a set

{ < a, r, b > }

where a and b are elements of a finite set of features and r is an element of a set of operators defined as assembly mechanisms. The resulting "symbol system" has the capability to serve as a cognitive aid. Services provided by an architecture are then those that combine, or assemble, compounds { Ci }. Associated with these formal compounds are categories useful in digitally describing knowledge.
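The triple set and its assembly into compounds can be sketched in a few lines of code. This is a minimal illustration, not the article's implementation; the feature names, the operator names, and the pairwise sharing rule for forming compounds are all assumptions made for the example.

```python
# A minimal sketch of the triple set { <a, r, b> } described above.
# Features, operators, and the compound-forming rule are illustrative
# assumptions, not taken from any particular system.

from itertools import combinations

# Finite set of features and a set of relational operators
# (the assembly mechanisms).
features = {"volatility", "price", "volume"}
operators = {"raises", "dampens"}

# The triple store: each element is <a, r, b> with a, b features
# and r an operator.
triples = {
    ("volatility", "raises", "price"),
    ("volume", "dampens", "volatility"),
}

def assemble_compounds(triples):
    """Group triples that share a feature into candidate compounds C_i."""
    compounds = []
    for t1, t2 in combinations(sorted(triples), 2):
        shared = {t1[0], t1[2]} & {t2[0], t2[2]}
        if shared:  # two triples sharing a feature assemble into a compound
            compounds.append(frozenset([t1, t2]))
    return compounds

compounds = assemble_compounds(triples)
```

Here the two triples share the feature "volatility", so the sketch assembles them into a single compound; categories would then be attached to such compounds rather than to the raw triples.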

For example, computing systems may be algorithmically linked to measurements of individual learning, while taking strong measures to preserve data security. The resulting technology has the potential to improve educational performance. Our proposals suggest that adaptive assessments, representation systems, and recommender systems are best combined when some process is in place that extracts features across multiple instances, resulting in a representation of the categorical invariances in play.

Behavioral neuroscience is clear about some simple facts. Cognitive representations are learned through repetitive cycles of observation and self-directed inquiry. The action-perception cycle [REF-12, REF-13] creates deep structure in the behavior of individuals and organizations. Of course, deep structure is maintained in human individuals in ways different from how an organization maintains deep structure. But clearly the notion that deep structure exists and is operational in behavior is useful.

Theory about natural deep structure is not easily understood. The conjecture is that deep structure is created and persists in biology and other systems. However, elements of a specific natural object's deep structure are invisible at the level of ordinary behavioral observation. An encoding of features is implemented in deep learning computing architectures using specific methods, such as the extraction of a small set of representations of invariance across multiple instances.
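The extraction step at the end of the paragraph above can be illustrated in its simplest form: given several observed instances of the same object, keep only the features present in every instance. This is a toy sketch under the assumption that instances are already described as feature sets; the feature names are hypothetical.

```python
# Illustrative sketch: extract a small set of invariances across
# multiple observed instances. Instance data is hypothetical.

def extract_invariances(instances):
    """Return the features present in every instance (the invariant core)."""
    feature_sets = [set(inst) for inst in instances]
    return set.intersection(*feature_sets)

# Three observations of the "same" object under different conditions;
# the varying features (shadow, glare, occlusion) are surface noise.
instances = [
    {"edge", "curve", "shadow"},
    {"edge", "curve", "glare"},
    {"edge", "curve", "occlusion"},
]

deep_features = extract_invariances(instances)  # the shared substructure
```

The invariant core ({"edge", "curve"}) is what survives across observations; in the stratified view, it is a candidate element of deep structure, while the per-instance features remain surface behavior.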

In stratified computing, a separation between organizational scales is created as part of the expected architectural realization. As a consequence, deep structure computing has implications arising from its value to applied analytics as well as to basic research. To achieve an advance in the use of deep structure computing, public funding is sought. It is noted that the potential value of deep structure computing has been overshadowed by "ordinary analytics" based solely on statistical methods.

Strongly bound ontology is problematic for reasons related to definitions and coherence. As an example, the notion of a replicator mechanism is referenced, often poorly, in the systems literature by the term "memetic" [REF-14]. Even when poorly developed, such terms allow scholarly discussion about how neurology is connected to the behavior of an individual. However, when things are oversimplified, we miss seeing the value in extracting from data a representation of deep structure, and of the computable services needed to model the aggregation of subsets of deep structure elements into compounds.

We consider this issue of oversimplification an indicator of something ingrained in our culture. Deep structure is thought of as a cause of the things we see. But what would that say about Newtonian laws? In fact, deep structure is just part of a complex reality producing behavior in living systems. Physical reality is more complex than Newton imagined.

Consistent with stratified theory, our personal set of engrams is what we remember with. This is important because how we experience is not the same as how we remember experiences [REF-15]. Like the motor programs used to produce speech, memory engrams are composed to produce our remembered self. The Newtonian simplification misses the value of representing deep structure because an engram is not directly observable. Each engram makes no sense when represented alone [REF-16].

Social Media

We make a prediction: social media will develop tools that bring control over analytics to the individual. Our conjecture is that a specific data processing architecture optimally supports the integration of deep learning methods, adaptive assessment, representational theory, recommender systems, and digital handwritten message exchanges.

This work is interdisciplinary in nature, and will need supported, i.e., funded, research in computer science, psychology, sociology, and neurological science.

Descriptive Enumeration

Descriptive Enumeration (DE) applies principles borrowed from the axiomatic foundations of mathematics. Here we use the following observations. Knowledge about a topic is canonical if and only if it is orthogonal to other topics and the set of topics is complete. The representation should be minimal in the number of its formal constructions. Representation should also reflect organizational stratification, as observed in nature.

Any set of axioms has similar properties. The axioms are minimal in nature, quasi-independent, and span a universe of discourse. By span, we mean that combinations of axioms, when used according to well-defined rules of inference, produce a set of useful compounds, e.g., theorems in the domain of pure mathematics.

From such a set a generative structure is produced. Inference rules are applied to a set of axioms to produce an inferential field [REF-17]. But axiom sets have specific formal natures [REF-18]. The stock-trading rules mentioned earlier, and the information passed along in tweets, rapid-fire Q&A, emails, and many other attempts at knowledge transfer, do not.
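The generative structure just described can be sketched as a forward-chaining closure: start from the axioms and repeatedly apply inference rules until nothing new is produced. The propositions and rules below are invented for illustration; the point is only the shape of the computation.

```python
# A toy generative structure: axioms plus inference rules produce an
# "inferential field" of derived compounds. Propositions and rules
# are invented for this illustration.

def inferential_field(axioms, rules):
    """Close a set of axioms under rules of the form (premises, conclusion)."""
    field = set(axioms)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            # Fire a rule when all of its premises are already in the field.
            if premises <= field and conclusion not in field:
                field.add(conclusion)
                changed = True
    return field

axioms = {"A", "B"}
rules = [
    (frozenset({"A", "B"}), "C"),  # A and B together yield C
    (frozenset({"C"}), "D"),       # C yields D
]

field = inferential_field(axioms, rules)
theorems = field - axioms  # the derived compounds
```

Here the field closes at {A, B, C, D}, with C and D playing the role of theorems. The minimality and quasi-independence properties of the axiom set are not checked by this sketch; it only demonstrates the span.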

Categorical analytics, done right, enables computing technology to support adaptive learning. We have only to integrate existing knowledge about formal systems and natural systems. For example, certain logics were produced from our efforts to compress data into knowledge [REF-19]. These logics may be applied to tweets and other linguistic expressions. What is needed is a computing platform that integrates service computing principles with deep learning methodology [REF-20].

Service Oriented Architecture (SOA) [REF-21] is in fact an efficient means to find the deep structure related to the various tasks associated with commercial computing. So when we use principles from stratification theory, we are on a path toward a more complete and general framework supporting investigations of any type, including individual study of college coursework.

The neurological theory is simple. Learning, when internalized, creates long-term changes in the individual's capacity to learn more or to remember clearly. These changes may be positive or negative. A positive experience with deep learning allows the individual to grow more confident and enhances self-image. Self-confidence contributes to long-term retention of topics learned.

Deep learning methodologies like Inquiry Based Learning (IBL) [REF-22] have the potential to be truly transformational. But again, we see the marginalization of these methodologies for social reasons that are complex in nature. One of the issues has to do with cultural beliefs about what is positive and what is negative, and as a consequence, about what should be a learning goal. An encoding methodology that induces changes in organizational deep structure is generally positive simply because there is growth in character and capacity.

Deep learning theory proposes to explain why IBL works, and how to make it work in more situations. In particular we are interested in enhancing social resources available to underserved populations. In underserved communities, social and individual deep structure has formed under less than ideal circumstances. Part of our research will be directed at achieving a sense of fairness. Linking neurological models to student learning objectives promises to produce cultural knowledge necessary to achieve fairness.

Creativity and Innovation

A creative process is required to express internalized knowledge. Our proposed social media will enhance creativity by giving viable choices to the individual. Shifting responsibility for learning from the mentor/expert to the mentee/apprentice is reflected in a sense of ownership and renewed capacity to demonstrate knowledge.

When creativity is enhanced, we observe individual transformations in freshman college math classes. Students are given the opportunity to use deep knowledge transfer methods to express new ways of thinking about mathematics and about how to study mathematics. The goal is not to create new mathematics, but to contextualize freshman mathematics within the individual's typical image of self [REF-23]. IBL methods were used [REF-24].

As deep learning methods are used and are seen to have success in mathematics classes, many students experience increased courage and capacity to excel in other academic disciplines. Similarly, we expect that applying deep structure techniques in the workplace will accelerate a capacity to break down organizational barriers and silos. Today's personal needs include understanding interrelationships among many topics, perspectives, and disciplines. This increase in shared human knowledge creates cultural values and increases competitiveness.

Our perception is that digital encoding of organizational deep structure will be part of a global effort to produce value by reducing waste. Deep structure for humans may be regarded as part of what we share via the use of language [REF-25, REF-26]. In a similar fashion, the deep structure of organizational systems shares commonalities with that of other systems [REF-27]. One way for research communities to communicate about "non-observables" such as deep structure is through the use of terms like "engrams", or phrases like "replicator mechanisms".

These terms create a folk taxonomy, where agreement facilitates communication specific to one discipline. How we assign meaning to these terms is then a precursor to our creating and communicating about things that can only be represented in an abstract sense.

Categorical Analytics

By definition, computable deep structure is constructed so as to reveal the character of a system. What we perceive directly, when not aided by results from natural science, is not deep structure. We see deep structure only as something inferred using our private or public symbol systems. We do not see the atoms, only the properties of compounds.

Communication, and an increase in effectiveness, might be realized with stratified categorical analytics. Algorithms find candidates for deep structure representation. Human communities then validate how meaning is assigned to compositions from deep structure. We then encode representations of deep structure as computable ontology.

An organization's capacity to perceive and respond to risks and opportunities is essentially the same as the ability to recognize category. A sub-optimal response is observed because the mind system is designed to avoid complexity [REF-28]. Categorical analytics organizes information situationally and addresses contextualization issues in real time. Statistical methods are often useful for gathering digital intelligence, but there is a non-statistical part of data analytics, namely categorical analytics. Here we find an active emerging literature to guide us.

This literature includes work on complex natural systems. Natural systems each have unique deep structure, similar to speech production systems. Our prior work includes generative systems such as gene expression; this work involved encoding gene expression ontology into computable forms [REF-29]. A means to express categorical data in a generative fashion was involved [REF-30]. For example, an enumeration of elements modeling substructure is advanced when an organization has a SOA governance policy [REF-31].

The capacity to address novelty comes from the human ability to observe external phenomena, break the observation down into its most fundamental components, store them in some way (e.g., as memory), and re-assemble them into a proper situational assessment. This process may be duplicated in Service Oriented Architecture (SOA) governance rules. Of course, the re-assembly may be incomplete due to novelty, or perhaps due to re-contextualization. This is a concern for both machine intelligence and human intelligence.

Novelty is managed by a different perceptual subsystem than responses to expected stimuli [REF-32]. Contextualization also constrains how we understand. Contextualization creates the need to shift from one viewpoint to a different viewpoint. But re-contextualization does not always work perfectly. For example, private experience sometimes creates barriers to learning. Fortunately, the presence of barriers may be associated with specific representation profiles. These profiles may be used to identify which remediation strategies are indicated. Intervention might then be guided by principled research on behavioral categories and remediation strategies based on ontological modeling of complex systems. A rendering of the model we developed is given in the published literature [REF-33].
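The profile-to-remediation linkage above can be made concrete with a small matching sketch. The barrier names, the strategy names, and the threshold are all hypothetical placeholders, not drawn from the cited research; the sketch only shows the shape of the mapping.

```python
# Hedged sketch: matching a student's representation profile to
# indicated remediation strategies. All profile keys, strategy names,
# and the 0.5 threshold are hypothetical placeholders.

REMEDIATION = {
    "novelty_avoidance": "guided inquiry with low-stakes exploration",
    "context_shift_failure": "worked examples in multiple contexts",
    "recall_inhibition": "spaced retrieval practice",
}

def indicated_strategies(profile, threshold=0.5):
    """Return remediation strategies for barrier scores above threshold."""
    return [REMEDIATION[barrier]
            for barrier, score in sorted(profile.items())
            if score > threshold and barrier in REMEDIATION]

# A profile with one strong barrier and one weak one.
profile = {"novelty_avoidance": 0.8, "recall_inhibition": 0.2}
plan = indicated_strategies(profile)
```

In a principled system, the table would be populated from validated research on behavioral categories rather than hand-written, and the scores would come from the representation profiles the text describes.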

Applied Research

The full integration of advanced computing regimes with the research literatures of neuroscience and psychology is on the horizon. Our conjecture is that a class of replicator mechanisms is responsible for producing behavior. This conjecture will be the subject of continuing research. Gene expression and cell-signal pathways are generative in nature; thus the developed formalism and computing infrastructure will be useful. Basic research on generative systems is thereby advanced in several ways.

Service-oriented computing best practices should regard these invariances as indicators of social engrams [REF-34]. These indicators are to be understood as computed artifacts, with results checked in various ways. A governance policy is required; governance policy finds, stores, implements, and updates the enumerated taxonomy, with its deep structure. Deep structure representation is constructed through computational extraction of invariances across many instances [REF-35]. In our design, computing engines, such as those derived from quasi-axiomatic theory [REF-36] and J. S. Mill's logics [REF-37], process raw, unfiltered data structures. The structures are then encoded with a variable encryption that cannot be decoded except to activate super-distributed messaging [REF-38].

Our opportunity is to develop something that does not now exist. This new type of social media is on the horizon. Our work would bring a new generation of communication systems into the service of a PhD program in mathematics education. As professors, we talk about how nice it would be to edit computer-generated tests or assign customized homework, and about how bad the textbook is. But much of this is out of the control of any one professor.

One needs general systems theory to see the full picture. For example, a textbook adoption may please everyone at first, and then, for unexplained reasons, the text is made larger in the next edition. When the full picture is given, future product lines are revealed that do not depend on large, bulky physical textbooks. Markets will evolve to meet demand, even if this evolution is slow in coming. We are at a turning point. Vendors understand that a lead, for example in adaptive assessment, is important. But this type of lead can quickly be surpassed by other vendors. The market needs to step beyond current practices and standards. Again, we give an example: saying that AI is in the product is easy, but really delivering mature next-generation learning environments is still beyond our current technology.

On the other hand, inquiry-based methods are difficult to implement in the college classroom, with or without technology. Adaptive assessment, done right, will support social media functions. And so here is the opportunity. Categorical analytics provides the algorithms necessary to support individualized, real-time adaptive assessments. Learning will be centered in the physical classroom, but will also be supported by secure and open digital communication between professors and students.

What is provided is a match between structure and function. This alignment is accomplished using social media equipped with categorical and statistical analytics, and a super-distributed authoring system. Revenues still flow from content authorship and use statistics.

Design Elements

We are seeking the optimal design of social media supporting college-level mathematics education. One necessary objective is that each student internalizes the topics being learned. The central question in mathematics education is about this very thing: how to measure learning. How can measurement be encoded into a hash table? What type of service architecture and governance regime is needed so that basic research on both K-14 and college-level teaching of mathematics is demonstrated?

The advances in science and technology provide all of the tools we need. Natural science may be employed to tell us how internalization of experience occurs. Our work will focus publishable research on core questions such as these. The present challenge is to integrate deep learning methods and theory with the theory underlying inquiry-based methods. This means the integration of the neurology of learning with computing theory.

Why are neurology and immunological science important? We must remember that there is a long period of individual adaptation of the image of self while in math class. There are reasonable grounds for conjecture regarding neurological changes due to unpleasant experiences.

In the design of a computing platform for social media, one may use services that parallel the well-known generative architectures found in the study of plant and animal immune systems [REF-39, REF-40]. The integration of this work into computing architecture is possible. Through the use of new social media, the classroom experience might be positively changed by providing teachers and instructors a place to guide students outside of the physical classroom.

Computing Platform

In Levine's work, the biologically feasible neural network model is "complete" and has been reviewed in neuroscience journals [REF-41]. Our research would use this model, digital rubrics, and a new form of social media to measure learning during the semester in which a student is enrolled in a math class.

Some relevant neurological behavior is modeled by a system of differential equations [REF-42]. The mathematical model is based only on first-order differential equations, and yet the simulation accounts for an acquired learning disability. We have conjectured that acquired disability results directly from poor experiences in math class [REF-43].
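To show the flavor of such a first-order model, the toy simulation below couples a learning trace to an inhibitory trace driven by negative experience. The equations and parameters are our own illustrative assumptions, not the published model from [REF-42]; the point is only that a pair of first-order equations already suffices to produce a suppressed, "acquired disability" outcome.

```python
# Illustrative only: two first-order equations, integrated by the
# Euler method, in which an inhibitory trace I built up by negative
# experience suppresses a learning trace L. The equations and
# parameters are assumptions, not the cited published model.

def simulate(steps=200, dt=0.05, negative_input=1.0):
    L, I = 0.0, 0.0  # learning trace, inhibitory trace
    for _ in range(steps):
        dL = (1.0 - L) - 2.0 * I * L   # growth toward 1, gated by inhibition
        dI = negative_input - 0.5 * I  # inhibition driven by bad experience
        L += dt * dL
        I += dt * dI
    return L

healthy = simulate(negative_input=0.0)   # no negative experiences
impaired = simulate(negative_input=1.0)  # sustained negative experiences
```

With no negative input, the learning trace relaxes toward its full value; with sustained negative input, the inhibitory trace settles near its own equilibrium and caps the learning trace at a fraction of that value, a qualitative analogue of an acquired learning disability.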

But the linkage between neurology and topic-level measurement of learning has yet to be completed. The key is found in independent constructions of an ecosystem of computing machines, and in Hadoop-supported interaction in near real time.

Our research platform will encode ontology from data, and persist the ontology without further need for the data. Learning higher mathematics will become natural and expected. Adaptive assessment will move groups, in cohort format, through university-enumerated topics, within knowledge representation spaces, in a non-linear and individualized fashion. Categorical analytics will assist students in self-organizing into temporary learning communities that disband once everyone has learned. Students themselves will assume increased responsibility for learning.

Deep Learning Architecture

Digital representations of learning may be segmented into a class of highly efficient digital objects. These are used to represent the deep structure of an individual's capacity to learn math. In our architecture, an object-oriented inheritance regime is equipped with communicating code that forms a computing backplane [REF-44], useful for encryption regimes. Digital profiles of topics learned and student objectives met have a natural and efficient digital structure [REF-45].

A protected object will be compressed in such a way that reading the object causes the object to communicate, in real time, to an authority that it is being read. This compression is "provably secure" in the sense that one knows without uncertainty whether the specific data object is secured or not.

A new type of data security is on the horizon [REF-46]. In our proposed research platform, data security is via the principle of super-distribution [REF-47]. Because of a new generation of super-distributed data security, social media might take on the form of a boarding school. Within a boarding school's legal responsibilities is the responsibility for protecting individual assessment data.

IP Objects

Small digital objects distinctly represent one student's learning profile. The profile will continuously change as the student learns. Learning will be measured and the resulting data encoded. Digital Learning Profiles (DLPs) are protected intellectual property objects, meaning protected by U.S. federal law. But, of course, this protection has to be enforced, and enforcement is where digital object technology is used.

To achieve a high level of digital control over a class of protected digital objects, we may use elements of number theory and a hash table [REF-48], and encode assessment data within Hadoop data repositories on isolated servers. Hash table objects are then attached to a very small virtual machine (less than 47 K), and the whole package is encrypted using a highly efficient encoding regime. Mobile Virtual Machines (MVMs) are now well understood, and may follow the design elements of virtual machines like the CoreTalk MVM, which is private IP owned by Sandy Klausner. Another VM is the Java Virtual Machine (JVM) [REF-49].
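The hash-table linkage can be sketched as follows: a student's encoded assessment record is serialized, keyed by a cryptographic hash, and stored against that key. The record fields, the use of SHA-256, and the in-memory dictionary standing in for a Hadoop store are all assumptions for illustration; the encryption and virtual-machine packaging described above are out of scope here.

```python
# Sketch of keying a protected Digital Learning Profile (DLP) record
# by a cryptographic hash. Record fields and the choice of SHA-256
# are illustrative assumptions; encryption is not shown.

import hashlib
import json

def make_dlp_object(student_id, assessments):
    """Serialize a DLP record and derive its hash-table key."""
    payload = json.dumps(
        {"student": student_id, "assessments": assessments},
        sort_keys=True,  # canonical ordering so equal records hash equally
    ).encode("utf-8")
    key = hashlib.sha256(payload).hexdigest()
    return key, payload

table = {}  # an in-memory stand-in for the isolated-server repository
key, payload = make_dlp_object("s-001", {"topic-3": "mastered"})
table[key] = payload
```

Because the key is derived from the canonical serialization, identical records always map to the same slot, and any tampering with the stored payload is detectable by re-hashing it against its key.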

Regarding Writing to Learn

Our remembrance of past experiences in math class may be encoded into a structural layer in the natural processes supported by neural and immunological mechanisms [REF-50]. Handwriting is necessary to long-term transfer of human knowledge [REF-51]. Social media is necessary to create the type of demand that will replace the published textbook. These are all given facts. So what is the solution to today's many challenges in education? What are we attempting to do? What is possible today using social media? Is it possible to embed handwritten message exchanges about mathematical topics? Math Twitter: this is what we are proposing [REF-52].

Many-to-One (M2O) Analytics

Stratified service architecture might be developed to natively support handwritten digital message exchanges (DMEs). Digital messages are composed from decomposed substructure; thus the computable services that might be developed should be governed by SOA governance principles.

An enumerated set of elements representing substructure is like a periodic table for social media; it operates in real time and evolves as social units change. This feature allows the use of utility functions and systems theory. Security over private data is also realizable within a stratified service architecture. The system captures a compressed representation of expression within social media. This representation may then be used to facilitate real-time communication between a group of students and a community of professors, or a single professor.

Many-to-one messages will likely soon be a common product of analytics. A community's communications may be observed, in line with federal laws on privacy, and representations of topics of interest projected as a means to communicate from the many to one person, or to a different community. To achieve efficiency, digital message exchanges are married to a process engine designed as a "virtual machine", like the Java Virtual Machine.

Service Oriented Architecture

Service Oriented Architecture (SOA) is then a natural fit, linking the stratification of behavioral atoms to the aggregated services provided by SOA-compliant computing platforms. Measurement is generally a difficult problem, and in this case we are attempting to measure learning. We may measure learning through an analysis of three rubric dimensions: skill, synthesis, and evaluative. Adaptive Rubrics (ARs) are used to encode invariance in data and to link these measures of invariance to predictive analytics. A few other requirements are specified in our architecture.

The topic-level measurement of learning must use knowledge space representations. A representation space holds various sets of enumerated topics and student learning outcomes. A designed system for encoding the output of a digital rubric is optimized in our work. The output from a digital rubric may be encoded with a class of encryption algorithms shown to be "provably" secure. Data security is protected by technology derived directly from what Brad Cox has described in his book on "Superdistribution" [REF-53].
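A minimal sketch of encoding a digital rubric's output is given below. The three dimensions follow the text (skill, synthesis, evaluative); the 0-4 scale and the base-5 packing scheme are illustrative assumptions, and no encryption is applied, only the compact encoding step that would precede it.

```python
# Minimal sketch of encoding digital rubric output. The 0-4 scale
# and base-5 packing are assumptions; encryption is not shown.

DIMENSIONS = ("skill", "synthesis", "evaluative")

def encode_rubric(scores):
    """Pack three 0-4 rubric scores into one small integer, base 5."""
    code = 0
    for dim in DIMENSIONS:
        s = scores[dim]
        if not 0 <= s <= 4:
            raise ValueError(f"{dim} score out of range: {s}")
        code = code * 5 + s
    return code

def decode_rubric(code):
    """Invert encode_rubric back to a score dictionary."""
    scores = {}
    for dim in reversed(DIMENSIONS):
        scores[dim] = code % 5
        code //= 5
    return scores

code = encode_rubric({"skill": 3, "synthesis": 2, "evaluative": 4})
```

An encoding this compact (at most 125 distinct values per topic) keeps per-topic rubric output small enough to store densely in the hash-table objects described earlier, while remaining exactly invertible.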

Our research is directed at giving guidance to mathematics education by providing a learning platform where deep learning and adaptive assessment are married. When classroom activities are not working, there will be help from outside the classroom. This online help must not isolate the teacher from the experiences of the students. Professors need to be in a communicative loop with all of the students, all of the time. The many-to-one function assists students in communicating as a class, or as part of a class section, with the professor.

Technical Summary

Categories may be used to identify and codify data according to a rubric. Several methodologies for doing this are available, and they inform the practice of categorical analytics. By definition, meaning is assigned to compounds composed from deep structure and subject to behavior due to induction, non-locality, or emergence [REF-54]. Superdistribution is one part of our architecture, which is Hadoop based. The platform is designed to help individual students develop inquiry into topics of higher mathematics, and it will produce high-quality source data about student learning objectives and about behaviors arising from past experiences in mathematics class.
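Since topic representation is encoded as RDF triples [REF-45], the category data behind this platform can be sketched as a small triple store. The predicate names and the in-memory list are illustrative assumptions; a real system would use a standard ontology vocabulary and a triple-store backend.

```python
# RDF-style (subject, predicate, object) triples for topics and outcomes.
triples = [
    ("topic:quadratics", "hasPrerequisite", "topic:linear_equations"),
    ("student:s1", "attempted", "topic:quadratics"),
    ("student:s1", "mastered", "topic:linear_equations"),
]

def query(store, subject=None, predicate=None, obj=None):
    """Return triples matching any fixed subject/predicate/object pattern."""
    return [t for t in store
            if (subject is None or t[0] == subject)
            and (predicate is None or t[1] == predicate)
            and (obj is None or t[2] == obj)]

# Everything recorded about student s1:
print(query(triples, subject="student:s1"))
```

Pattern queries of this kind are what let categorical analytics relate a student's behavior to the enumerated topic structure.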


[REF-1] Paul Stephen Prueitt is on the faculty supporting Mathematics Education at Texas State University and a lifelong researcher in deep structure methods. He may be reached at

[REF-2] Katzan, Harry Jr. (2008) Categorical Analytics Based On Consensus Theory, Journal of Business & Economics Research (JBER), Vol 8, Issue 8

[REF-3] Katzan, H. (1992), Managing Uncertainty: A Pragmatic Approach, New York: Van Nostrand Reinhold Co

[REF-4] Murray, Arthur (2012) From personal discussion.

[REF-5] Levine, D. & Prueitt, P.S. (1989) Modeling Some Effects of Frontal Lobe Damage - Novelty and Perseveration, Neural Networks, 2, 103-116.

[REF-6] Prueitt, Paul Stephen in Service Technology Magazine – (July 2013) Service Oriented Architecture and Data Mining: a step towards a Cognitive Media?

[REF-7] Prueitt, Paul S. (1995a) A Theory of Process Compartments in Biological and Ecological Systems. In the Proceedings of IEEE Workshop on Architectures for Semiotic Modeling and Situation Analysis in Large Complex Systems; August 27-29, Monterey, CA, USA; Organizers: J. Albus, A. Meystel, D. Pospelov, T. Reader

[REF-8] References on this are extensive. Please consult the wiki page:

[REF-9] Bayes, Thomas, and Price, Richard (1763). "An Essay towards solving a Problem in the Doctrine of Chances. By the late Rev. Mr. Bayes, communicated by Mr. Price, in a letter to John Canton, A. M. F. R. S.". Philosophical Transactions of the Royal Society of London 53: 370-418. doi:10.1098/rstl.1763.0053.

[REF-10] Dempster, A.P. (1967), Upper and Lower Probabilities Induced by a Multivalued Mapping, The Annals of Mathematical Statistics 38:325-339.

[REF-11] Shafer, G. (1976), A Mathematical Theory of Evidence, Princeton, NJ: Princeton University Press

[REF-14] Blackmore, Susan (1999) The Meme Machine, Oxford University Press, Oxford. ISBN 0-19-850365-2

[REF-15] The remembered self and the experienced self are not the same thing.

[REF-16] They participate as atomic elements whose realities are governed by a location within a compound.

[REF-17] Finn, Victor (1991). Plausible Inferences and Reliable Reasoning. Journal of Soviet Mathematics, Plenum Publ. Cor. Vol. 56, N1 pp. 2201-2248

[REF-18] Prueitt, Paul Stephen (1997) The simplest form of the J.S. Mill's logic, Invited presentation at VINITI in Moscow Russia.

[REF-19] Prueitt, P.S. (1995) A Theory of Process Compartments in Biological and Ecological Systems. In the Proceedings of IEEE Workshop on Architectures for Semiotic Modeling and Situation Analysis in Large Complex Systems; August 27-29, Monterey, Ca, USA; Organizers: J. Albus, A. Meystel, D. Pospelov, T. Reader

[REF-20] Prueitt, P. (1998). An Interpretation of the Logic of J. S. Mill, in IEEE Joint Conference on the Science and Technology of Intelligent Systems, Sept. 1998.

[REF-21] Erl, Thomas (2008) Service-Oriented Architecture: A Field Guide to Integrating XML and Web Services, Prentice Hall


[REF-23] Bandura, A. (1997). Self-efficacy: The exercise of control. New York: Worth Publishers.

[REF-24] Prueitt, Paul Stephen (2013) Stratification Theory and Computing, Annual R L Moore conference

[REF-25] Whorf, Benjamin (1956), John B. Carroll (ed.), ed., Language, Thought, and Reality: Selected Writings of Benjamin Lee Whorf, MIT Press

[REF-26] Chomsky, Noam (1965). Aspects of the Theory of Syntax. Cambridge, MA: MIT Press. ISBN 978-0-262-53007-1.

[REF-27] Bertalanffy, Ludwig von (1968). General System Theory. New York: George Braziller Publisher

[REF-28] Prueitt, Paul S. (1996a) Optimality and Options in the Context of Behavioral Choice, in D. S. Levine & W. R. Elsberry, Eds. Optimality in Biological and Artificial Networks?, Erlbaum, 1996.

[REF-29] Prueitt, Paul Stephen (on going) Unpublished work on gene expression ontology. References provided on request to

[REF-30] Prueitt, Paul S. (1993) Network Models in Behavioral and Computational Neuroscience, invited chapter in Nonanimal Models in Biomedical & Psychological Research, Testing and Education, New Gloucester: PsyETA.

[REF-31] Erl, Thomas (2012) Governing Shared Services On-Premise & in the Cloud Prentice Hall

[REF-32] Levine, D. & Prueitt, P.S. (1989) Modeling Some Effects of Frontal Lobe Damage - Novelty and Perseveration, Neural Networks, 2, 103-116.

[REF-33] Levine, Daniel and Prueitt, Paul Stephen (1992) Simulation of conditioned perseveration and novelty preference from frontal lobe damage. In Michael L. Commons, Stephen Grossberg, and John E. R. Staddon, editors, Neural Network Models of Conditioning and Action, chapter 5, pages 123-147, Erlbaum Associates

[REF-34] Prueitt, Paul Stephen (2011) Social media and categorical analytics, Blueberry Brain Conference

[REF-35] Prueitt, Paul Stephen (1998) Shallow link analytics, scatter-gather and parcellation methods. (reference)

[REF-36] Prueitt, Paul S. (1994.) System Needs, Chaos and Choice in Machine Intelligence. Chaos Theory in Psychology (A. Gilgen and F. Abrams, Eds.) Contributions in Psychology Series. Westport, Conn.

[REF-37] Prueitt, P. (1997). Quasi Axiomatic Theory, represented in the simplest form as a Voting Procedure. Presented in Moscow at a conference held at VINITI, and published in All Russian Workshop in Applied Semiotics, Moscow, Russia. (Translated into Russian and published in VINITI Conference Proceedings.)

[REF-38] Prueitt, Paul Stephen (ongoing) Unpublished work on elementary number base invariances and encrypted hash table formats.

[REF-39] Eisenfeld, J. & Prueitt, P.S. (1988) Systemic Approach to Modeling Immune Response. Proc. Santa Fe Institute on Theoretical Immunology. (A. Perelson, ed.) Addison-Wesley, Reading, Massachusetts.

[REF-40] Prueitt, Paul Stephen (1988) Dissertation, University of Texas at Arlington.

[REF-41] Levine, D. S. (2012). Neural dynamics of affect, gist, probability, and choice. Cognitive Systems Research, 15-16, 57-72.

[REF-42] Prueitt, Paul Stephen (1988) Dissertation.

[REF-43] Prueitt, Paul Stephen (1988) Discrete Formalisms and the Ensemble Modeling of the Instructional Process, published in the proceedings of The Fifth International Conference On Technology and Education, Edinburgh, Scotland, March 1988

[REF-44] Prueitt, Paul Stephen (1988) Mathematical Models of Learning in Biological Systems, PhD Dissertation University of Texas at Arlington.

[REF-45] Topic representation is encoded as RDF triples, using widely known ontology representation standards.

[REF-46] There is more to say here about knowing that information which once existed in digital form is no longer in digital form. It is possible to infer patterns of data in the digital world from the message exchanges flowing through digital networks.

[REF-47] Cox, Brad (1991) Object Oriented Programming: An Evolutionary Approach. Addison Wesley. 1991. ISBN 0-201-54834-8.

[REF-48] Hash tables are a simpler technology than database management systems.

[REF-49] Other innovations are involved in deep learning architectural designs. Prueitt, Paul Stephen (1988) Mathematical Models of Learning in Biological Systems, PhD Dissertation University of Texas at Arlington.

[REF-51] James, K.H. & Atwood, T.P. (2008). The role of sensori-motor learning in the perception of letter-like forms: tracking the causes of neural specialization for letters. Cognitive Neuropsychology.

[REF-52] Prueitt, Paul Stephen (June 23, 2014) Individually Directed Inquiry, R L Moore Legacy Conference 2014, Denver, CO. Presentation available on YouTube

[REF-53] Cox, Brad (1996) Superdistribution: Objects as Property on the Electronic Frontier. Addison Wesley. ISBN 0-201-50208-9.

[REF-54] Kugler, P.N. & Turvey, M.T. (1987) Information, natural law, and the self-assembly of rhythmic movements. Hillsdale, NJ: LEA.