In the Spirit of Collegial Inquiry...

updated: 11 Apr 98

Intelligence: Meaning and Measurements

DM:   ...A topic which I would find interesting is a short explanation of Sternberg's triarchic theory of intelligence, to which I admit ignorance. I understand that Sternberg does not advocate multiple intelligence factors, as Gardner does, so I don't know what triarchic means. The reason I ask is that I list Collegium** as a " multiple intelligence " society, which may not be entirely true.

JCC:   He's been writing in the field for over 15 years... I'm hoping to find an affordable used copy of Successful Intelligence and Defying the Crowd: Cultivating Creativity in a Culture of Conformity. Sternberg's model of intelligence strikes me as rather similar to Gardner's, but restricted to three axes: Analytic, Practical, and Creative/Synthetic. Sternberg considers further subsets within each of these three in his more comprehensive tests. The Analytic " pillar " is most similar to conventional IQ. Gardner's system is sevenfold, if memory serves, separating out athletic and musical abilities, etc. Both are really widening the conventional view of what qualities are considered under the umbrella of intelligence. With luck I've not introduced much distortion in the basic picture through oversimplification. So-called common sense and the emotional intelligence reflected in interpersonal skills are among the traits gaining recognition as part of the total picture of mental maturity.

DM:   Julia, is it appropriate for me to use the email list to discuss topics such as ... [history and theoretical basis of intelligence testing] ?

JCC:   An excellent idea I think! If you've no objections, I'm formatting some of these discussions, with minor editing, as a part of the web pages, saving them for future members. They will probably be of general interest, but topics here need not be restricted to mental measurement.

GAG:   I read your recent message with much concern. Where are you hearing that Sternberg's approach is NOT multi-intelligence based?

DM:   As I understand it, Sternberg does use the word "intelligence" in multiple places, i.e., analytical intelligence, creative intelligence, and practical intelligence. But his usage is different from that of Gardner, who seems to use the term " intelligence " as a synonym for " factor. " That is, Gardner does not subscribe to the idea of a single factor (g) that is responsible for a variety of cognitive abilities, instead arguing for multiple factors (" intelligences "). Sternberg, on the other hand, does admit that factor analysis of traditional IQ tests points toward the existence of g. However, he argues that g is not the only important aspect of human cognition. This much I gathered from opening one of Sternberg's books and glancing at a random page.

The dilution of terms like " intelligence " and " IQ " is an aspect of this field that I find irritating and confusing. For example, IQ refers both to a number calculated as " mental age divided by chronological age " and to a number derived from the standard deviation score of a testee. Worse yet, using the latter definition of IQ, there is no agreed-upon number of IQ points per standard deviation.
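
[Editor's note: a minimal sketch, in Python, of the ambiguity DM describes. The ratio and deviation formulas come from the discussion; the particular conventions of 15, 16, and 24 points per standard deviation (the Wechsler, older Stanford-Binet, and Cattell scales respectively) are standard examples added here for illustration.]

```python
# The same performance yields different "IQ" numbers under different conventions.

def ratio_iq(mental_age, chronological_age):
    """Early Binet-style quotient: mental age divided by chronological age, times 100."""
    return 100.0 * mental_age / chronological_age

def deviation_iq(z_score, points_per_sd):
    """Modern deviation IQ: mean 100, with some chosen number of points per SD."""
    return 100.0 + z_score * points_per_sd

# A performance three standard deviations above the mean:
for sd in (15, 16, 24):
    print(f"z = 3.0 is IQ {deviation_iq(3.0, sd):.0f} on a {sd}-points-per-SD scale")

# Ratio IQ for a 10-year-old performing at the level of a typical 13-year-old:
print(f"ratio IQ = {ratio_iq(13, 10):.0f}")
```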

JCC:   My feeling is that Sternberg's views hold a reasonable middle ground between two polarities on whether intelligence is basically a unitary entity (the traditional " g " factor) or a cluster of abilities having relatively little interrelation. It might be suggested that the issue is not a " real " one but dependent upon point of view. There is no universal agreement on the boundaries that might be drawn between intelligence, creativity, talents, and aptitudes. Gardner's model encompasses such a wide perspective that many of these " intelligences " cannot be correlated well enough to be treated as a single thing. The more traditional models, descended from Binet's work in the early years of the century, were more restricted in the scope of abilities considered as intelligence. Vocabulary items, verbal analogies, synonyms, antonyms, general knowledge ... such subtest scores tend to be reliable indicators of the full-scale IQ. Number series, varieties of logical and arithmetic problems, figure analogies, and occasionally short-term memory items are other frequent mainstays of tests. Some of these correlate less well with the construct being measured. In the history of testing, "g" came to be viewed as a composite of "crystallized" and "fluid" intelligence. The mind, seeking to analyze itself, stumbles against its own process of defining the terms rigorously, as if measuring the shadows cast by lights placed at various angles. My description cannot do justice to nearly a century of research, but it does not surprise me that the very existence of intelligence is questioned anew. We bemoan its absence, yet find definition elusive because we know this quality only by inference rather than by any direct measurement of brain activity.

The mental/chronological age ratio is from the early history of testing, not meaningful for adults of course. It gave the quotient system its original rationale prior to its statistical foundation. A z-score (or similar representation) avoids the ambiguity of IQ scores, but there appears to be so much neurotic fixation on the latter that the convention may take a long time to change. I rather like the [tripartite notation of] IEQ which Greg has used on score reports. Again, it's something of a middle ground which may prove constructive.

CML:   Greetings... I was just accepted for membership in Collegium**. If the discussion in my e-mail is any indication of the usual level of discourse, I think I'm going to enjoy myself.

As some of you know, I publish the Eastern Edition of Noesis, the journal of the Mega Society***. The Mega Society has recently had a problem with the classification of its untimed, unsupervised, and insanely difficult entrance exams, necessitating a consideration of how different kinds of test may furnish different measures of intelligence. Specifically, we've had to turn our attention to the distinction between timed low-complexity tests and untimed high-complexity tests.

Anyone familiar with the elementary theory of computation knows that these kinds of tests differ in two key computational parameters bearing a well-defined nonlinear mathematical relationship in the context of any specific computational device: computation space, denoting the number of processing or memory elements involved in a computation, and computation time, denoting the number of steps or clock-cycles from input to output. Combined with a complexity analysis of various problems and algorithms, this relationship has certain definite implications for the functionality of various categories of device.
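
[Editor's note: the space/time distinction can be made concrete with a toy example in Python; the problem chosen (Fibonacci numbers) is purely illustrative and has nothing to do with test items.]

```python
def fib_time_heavy(n):
    """Stores almost nothing beyond the call stack, but recomputes subproblems
    from scratch, so the number of steps grows exponentially with n."""
    if n < 2:
        return n
    return fib_time_heavy(n - 1) + fib_time_heavy(n - 2)

def fib_space_heavy(n):
    """Keeps a table of every intermediate result - more computation space -
    but the number of steps grows only linearly with n."""
    table = [0, 1] + [0] * max(0, n - 1)
    for i in range(2, n + 1):
        table[i] = table[i - 1] + table[i - 2]
    return table[n]

# Same answer, very different balance of computation space and computation time.
assert fib_time_heavy(20) == fib_space_heavy(20) == 6765
```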

Although the human brain transcends our current mechanical knowledge of computation, it can be generally categorized as a kind of parallel distributed processor called a "neural network". Thus, although we don't know the exact details of its structure and function, we have a general computational model to which it would seem to conform. The mathematical description of this PDP model contains certain general invariants in terms of which human intelligence can be tentatively analysed.

Preliminary analysis shows that intelligence, while in principle equating to a general intelligence factor g, naturally assumes certain physical characteristics of the device which serves as its vehicle. Although these characteristics are linked in standard parametric relationships, they are to some extent independent. This constrains the interpretation of psychometric statistics involving different tasks and different kinds of test.

As we all know, the science of statistics furnishes a means of gathering data pertinent to the construction of a cause-and-effect model representing the phenomenon under empirical study. However, there are limits to this benefit. At some point, the tables turn, and a model is necessary for the meaningful interpretation of statistics. We have now reached that point in the study of human intelligence. Future issues of Noesis ... will contain a development of these concepts.

For now, it will suffice to note that the evolution of the definition of "IQ" from "mental age/chronological age x 100" to a number based on standard deviation has taken psychometrics from a primitive model to no model at all, and that this leaves the interpretation of psychometric statistics in a vacuum. The first step in filling that vacuum is to begin formulating distinctions in terms of which a new model can be constructed. The distinction between IQ and IEQ**** is an excellent start.

JCC:   Chris, thank you for an excellent initial letter! I'll force myself to be uncharacteristically brief, in the interest of forwarding this promptly to the list. You've done justice to the core problems of psychometrics. Noting that " ...intelligence, while in principle equating to a general intelligence factor g, naturally assumes certain physical characteristics of the device which serves as its vehicle... ", it may well be that the vehicle is the device in the case of the human brain. Doubtless our perspectives will change with greater comprehension of the physical process of thought.

Is the distinction between thinking and feeling not an arbitrary one, dating back to mythological views centering emotions in the heart? There seems still a great deal of subjectivity in just what processes are subsumed under the umbrella of intelligence. I do agree that the concepts of Gardner and Sternberg offer a constructive advance, manifest here in our preference for a multifaceted index of IEQ or Ability Quotients. We are really beginning to acknowledge the importance of creative and interpersonal abilities ... obvious but neglected widely in the past.

CW:   Interesting discussion regarding processing constraints and intelligence. One factor that must be included with the notion of iteration is the probability that there will be a repetition of a given step. For a computer the likelihood that it will follow the instructions in a program is almost 1; for a human this likelihood is very much less than one. Even in programming models such as neural nets (where there are generic instructions and the program creates a set of pseudo "instructions" by "learning" from data acquired through past performance), the physical ability of a computer to precisely reproduce a given action is very different from a brain's ability to do the same. The computer, however, is typically not designed to survive even a small breakdown or to repair itself - both of which a brain can do to a limited extent.

A brain seems to record information differently, in that it is the connections between cells, and not the state of any given cell at any given time, that constitute the method of storage and retrieval. Furthermore, this is not an all-or-nothing affair, but rather an increase or decrease in the probability that a given cell will fire if a neighbouring cell has fired. IQ tests are therefore very much a black-box analysis of learning. What the test creator believes is relevant to intelligence will very much influence the " intelligence " tested. But ultimately why someone got a higher score or a lower score is not known. One very innovative test put out in the early 1980s simply rated you on your ability to define what intelligence meant relative to your culture. It was comparable to school performance in predicting IQ.

The other problem with IQ tests (I won't talk about g since it is an abstraction of an already abstract measure - maybe it should stand for " grail " ) is that they don't tell us what we can do to improve our performance. So I think that at some point the concept of IQ will be outmoded by more exact knowledge that will give people more choice to modify what they have. In some sense depression could have similarities to neural conditions that create " low IQ " . Inefficiency in processing in certain areas leads to emotional frustration independent of processing in other areas. I experience this all the time, since my personal life lags well behind my intellectual life. I am worried about some personal problem, turn on the computer or go do (or think about) something of a more practical nature, and poof, the " problem " is gone - until I think about it again, of course.

Anyhow, to summarize and propose a hypothesis: increasing the rate at which the probabilities of neural connections can change will increase intelligence. I think that could be a testable hypothesis.

CML:   Regarding your hypothesis: first, behavioral dependencies among neurons are mediated by physical connections with hard-wired and soft-wired aspects. The weights (or resistances) of these neural connections change - i.e., the connectivity matrix is transformed - according to a "learning function". As the weights of hard-wired connections change with respect to the firing thresholds of various neurons, individual neurons change their behavior, and the system follows suit.
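
[Editor's note: a minimal sketch, in Python, of the kind of weight update CML describes, assuming a simple Hebbian-style learning function; the names, numbers, and rule are illustrative and are not taken from the Rumelhart, Hinton, and McClelland paper cited below.]

```python
import numpy as np

rng = np.random.default_rng(0)
n_units = 4
weights = rng.normal(scale=0.1, size=(n_units, n_units))  # connectivity matrix
thresholds = np.full(n_units, 0.5)                         # firing thresholds

def fire(activity, weights, thresholds):
    """A unit fires when its weighted input exceeds its threshold."""
    return (weights @ activity > thresholds).astype(float)

def learn(weights, activity, rate=0.05):
    """The 'learning function': strengthen connections between co-active
    units, transforming the connectivity matrix on each presentation."""
    return weights + rate * np.outer(activity, activity)

pattern = np.array([1.0, 0.0, 1.0, 0.0])    # a recurring input pattern
for _ in range(20):
    weights = learn(weights, pattern)        # repeated exposure reshapes the weights

# After learning, the co-active units drive each other past threshold.
print(fire(pattern, weights, thresholds))
```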

CW:   Can you recommend any references relating to this (the resistance changes of neural connections)?

CML:   For a general model, try the paper " A General Framework for Parallel Distributed Processing " by Rumelhart, Hinton and McClelland. For an entry-level discussion of various specific " neural " devices, try the book, Naturally Intelligent Systems by Caudill and Butler.

CW:   To clarify, I was not attempting a comprehensive definition of intelligence, but rather suggesting that changing the ability to change the strength of existing neural connections will affect performance on IQ tests. Of course there are other factors: the raw speed at which a signal can be transmitted through a neuron, the energy consumption of neurons relative to available food energy, the ability of a neuron to fire repeatedly without failing, and, if a given area of the brain temporarily "wears out", the extent to which its function can be taken over by other parts.

CML:   Again, you're talking about a change in the learning function, which in its most comprehensive sense includes the biological constraints inhibiting changes of connective strength. Because IQ tests always call for a certain amount of learning, IQ test performance might very well rise with a faster learning rate. The last two factors you refer to are saturation and redundancy. Neurons do indeed saturate - how to inhibit saturation is an interesting question - and PDP systems in general have high potential redundancy. The D stands for " distributed ", mathspeak for " redundant " . Real learning occurs when this process effects a beneficial adaptation enhancing the system's ability to internally represent and react to input. Accelerating the rate of adaptive change of neural connections therefore equates to an acceleration of learning. This, however, does not necessarily imply an increase in either processing power or speed. So your hypothesis, rather than requiring an empirical test, stands or falls on the extent to which you equate learning, as opposed to raw processing potential, with " intelligence " .
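
[Editor's note: a small Python sketch of CML's point that a faster learning rate accelerates adaptation without adding processing power. Both runs below use the same two connection weights; only the rate of weight change differs. The rule (a simple delta rule) and the numbers are illustrative assumptions.]

```python
import numpy as np

def train(rate, steps=200):
    """Learn a fixed linear mapping with a delta rule at the given learning rate."""
    rng = np.random.default_rng(1)
    w = np.zeros(2)                       # the same 'hardware' in every run
    target = np.array([0.8, -0.3])        # the mapping to be learned
    for _ in range(steps):
        x = rng.normal(size=2)            # an input pattern
        error = target @ x - w @ x        # discrepancy between desired and actual output
        w += rate * error * x             # faster rate, faster adaptation
    return float(np.abs(w - target).sum())

for rate in (0.01, 0.1):
    print(f"learning rate {rate}: residual error {train(rate):.4f}")
```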

CW:   I don't know what you mean by real here. Certainly what you can learn is not necessarily always beneficial. An interesting aspect of learning that we tend to forget is that we never stop doing it whether we like it or not. Our brains in their activity are very much a dynamic system.

CML:   I mean real as opposed to delusory. Most psychologists attempt to distinguish learning from intelligence. For example, achievement tests are distinguished from intelligence tests in design and interpretation. But while psychologists tend to act as though they can be separated in principle, they cannot be completely separated in practice. This sort of confusion is only natural in the absence of a comprehensive logical framework in which to relate mental characteristics like learning and intelligence.

CW:   In Society of Mind Minsky talks about intelligence as being the ability to learn how to learn.

CML:   Although Minsky is a bit of a dinosaur - it's been many years since his revolutionary critique of the perceptron - he still comes up with a real gem now and then.

CW:   I guess what I am trying to say is that you can't really separate learning and raw processing power. Isn't it the case that every time a neural pathway is used the strength of the connections changes?

CML:   Yes. Usually, the learning function modifies strength according to frequency of usage. However, this is not necessarily the case. For example, the learning function might be designed to counteract the effects of bad data or misinformation.
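
[Editor's note: a toy Python contrast, under invented data, between a purely usage-frequency rule and a corrective rule of the sort CML mentions. The connection below is used on every trial, but the association it encodes is usually disconfirmed.]

```python
# (active, outcome) pairs: two confirming trials, then a run of disconfirming ones.
usage_history = [(1.0, 1.0)] * 2 + [(1.0, 0.0)] * 8
rate = 0.1

w_frequency = 0.0    # strengthened by sheer usage, right or wrong
w_corrective = 0.0   # moved toward the observed outcome on each trial

for active, outcome in usage_history:
    w_frequency += rate * active
    w_corrective += rate * active * (outcome - w_corrective)

# The frequency-driven weight keeps growing; the corrective weight rises briefly,
# then is weakened by the run of bad data.
print(f"frequency-driven weight: {w_frequency:.2f}")
print(f"error-corrective weight: {w_corrective:.2f}")
```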

CW:   It would be very unusual to see a computer that changed every time it was used in the same way that the brain does. One problem is that emotional learning may well be permanent, and any unlearning would necessarily have to work around this.

CML:   This is interesting in light of a theory, propounded several years ago by Dr. Crick of DNA fame, that the purpose of dreaming is unlearning ... checking current data for consistency with established knowledge and erasing false associations.

CW:   The Chemistry of Conscious States by Dr Alan Hobson has some interesting ideas relating to this as well. One problem with analyzing dreams is that when you become aware during them they change fundamentally. They could also be something similar to phantom limb pain in which an unused area of the brain remains activated when it would be better off inactive.

CW:   Is it necessarily the case that intelligence is based on computation?

CML:   Intelligence functions on multiple levels. On the very highest philosophical level, it can be described as a manifestation of pure will. However, the formal definition of computation - information transduction - is so general that virtually nothing of interest can occur without it.

CW:   So you are not specifically referring to step by step algorithmic processes specifically?

CML:   Well, yes I am...but in a much more general sense of the term algorithm than is usually meant. In this new usage, an algorithm is any set of deterministic or nondeterministic cognitive invariants governing the transformation of information by cognition. This is what it takes to relate cognition and computation. In the CTMU, the master algorithm of human cognition is called the HCS, or Human Cognitive Syntax.

JCC:   This is emerging, as hoped, into a truly fine discussion! Indulge please a small flight of fancy... suppose that you had a few computers, off the shelf, of a new " plastic " chip design, capable of growing to meet the needs of the tasks given to each by its specific work environment. Oh, some of the chips might start out with different inclinations based on origin, say for composition and playback of music, or for complex number crunching ... another for indexing of extensive text data by keywords, etc.

The critical thing is that the systems can build themselves up as needed for the tasks to which they are inclined, partly by initial design (nature, model class, inheritance of a sort) and partly by the kind of setting in which they begin daily development. The engineer returns in eighteen months to check up, performing a large series of different benchmark tests on each of the systems, in order to check performance, diagnose possible dysfunctions, and perhaps offer advice to the customers on any changes they might need to make in the external power, basic silicon and " vitamin " supply, and regimens of daily use.

Well, these different computers would have grown far more specialized by this time, and while they could be relocated to new tasks, basic specialty grooves would have developed. What kind of performance tests could possibly cover all the aspects of such " plastic computers " ?

CW:   Good question, in the sense that it is probably impossible to predict what they have learned without prior knowledge of the learning tasks involved. Also, this is more realistic than you might think, as there are chips that can have programmed connections (to facilitate chip design), although this is not done on the fly as you suggest. In any case, if you don't mind waiting longer for the results, such experiments could be done with software alone. The one probably insoluble problem is finding previously unknown abilities via standardized tests - abilities which could include doing known tasks more efficiently or thoroughly, as well as creating new tasks relating to more adaptive objectives.

Actually, the easy way out might be to get the computer to devise the test for you. In fact, each machine's task could be to devise tests that the other machines flunk, while working until it passes any test that it receives. Being given the test would simply be opening the loop. And of course you could test the "can god create a rock so heavy that even he/she couldn't lift it?" question by short-circuiting the loop and connecting the computer to itself. {That's the} definition of a very expensive barbeque.

CML:   A logical description of the purpose of each unit would be combined with:
(1) a complete logical description of the unit's environment, and
(2) a logical description of the unit's current behavior under all possible relevant inputs from that environment.
The second composite description would then be compared to the first using a measure of abstract information accounting for the frequencies of various inputs. High-frequency inputs define the specialty grooves to which you refer. This would provide a measure of efficiency relative to purpose ... i.e., relative to the most specific invariant level of programming. Evaluating and correcting dysfunctions would require, in addition, an exact knowledge of the unit's current structure, and the adaptive rule of structural evolution expressed as a function of both structure and environment.

The test for any given unit would thus consist of a series of tasks ranging over all input equivalency classes defined relative to the unit's purpose and structure, i.e., input-to-output pathways.
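
[Editor's note: a rough Python sketch of the frequency-weighted comparison CML outlines. The input classes, frequencies, and behaviors below are invented purely to show the shape of the calculation.]

```python
# How often each input equivalency class occurs in the unit's environment:
input_frequency = {"index_text": 0.7, "crunch_numbers": 0.2, "play_music": 0.1}

# Intended behavior (from the purpose description) vs. observed behavior (from probing):
intended = {"index_text": "build_index", "crunch_numbers": "compute", "play_music": "synthesize"}
observed = {"index_text": "build_index", "crunch_numbers": "compute", "play_music": "error"}

# Agreement weighted by input frequency: failures on rare inputs cost little,
# and the high-frequency "specialty grooves" dominate the score.
score = sum(freq for cls, freq in input_frequency.items() if observed[cls] == intended[cls])
print(f"frequency-weighted fit to purpose: {score:.2f}")   # 0.90 in this example
```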

By the way, can I charge for this?

JCC:   Probably, if we can locate a suitably large capacitor. {grin} Seriously, a fascinating exposition!

CW:   Another question that comes to mind: is thinking (consciously) relevant to intelligence or can we consider it to be a side effect?

JCC:   I've often considered that what we call consciousness may be precisely a side-effect of the " hardware ", particularly in that so much of what we claim as higher brain activity looks a lot like the juggling of more or less well-honed subroutines. In brief, that's only saying that my penchant is to define thinking in so wide a scope as to encompass the mentation of animals and our more sophisticated systems of hardware-plus-software. mind:brain::software:hardware, where the software has been partly learned as a basis for its own extension. Perhaps I'm just playing around with definitions of things intangible, but it seems to me that this describes what we are, and what our systems could grow to become ... yes, like any other virgin birth, to bring the cloning thread into this. {smile}

** Colloquy began 6 Jan 98 as the online component of a society then known as Collegium.

*** The Mega Society seeks members at the 99.9999 percentile, accepting scores from unsupervised untimed tests as its basis. See our Links to Sister Societies.

**** Intellectual Efficacy Quantitator, implying a wider conception of intelligence than that typically encompassed by standard measures.
