Trends Part 2

Analysis Preceding Theory

Scientific research typically begins with theory. Growing out of philosophical inquiry, the scientific method requires the researcher to begin by engaging in the essence of philosophical thinking: conceptual analysis. This process of examining and defining constructs culminates in the postulation of relationships and the design of a means of acquiring empirical data to test the theory. Analysis is driven by the theory under investigation. Now, however, data-gathering techniques are such that researchers can be faced with databases so large and complex that analysis must begin without a coherent theory in place. The process of investigating data acquired and stored by a computer is now essentially the same as that of investigating the data acquired and stored by the mind—the sifting, sorting, categorizing, and classifying, with constructs that modify and mutate as the analysis proceeds.

Data mining. Data mining is, in a sense, a computer-based “grounded theory” analysis approach. Rather than the scientific, verification-based approach (establish theory and use analysis to verify it), it is a discovery approach to analysis (hence the use of the term Knowledge Discovery in Databases). “Data mining is a set of techniques used in an automated approach to exhaustively explore and bring to the surface complex relationships in very large datasets” (Moxon, 1996, p. 1). Although most frequently applied to largely tabular databases, “these techniques can be, have been, and will be applied to other data representations, including spatial data domains, text-based domains, and multimedia (image) domains” (Moxon, 1996, p. 1). The computer programs that do the data mining use algorithms that can examine many multidimensional data relationships concurrently and highlight dominant or exceptional ones. In non-technical terms, the data mining programs employ a form of Kelly’s (1955) Psychology of Personal Constructs. The computer begins with a set of rather simple predictor constructs. As it attempts to handle the flow of data with these constructs, anomalies reshape the constructs in the process. These newly mutated constructs are then tested against the data. In this way the computer “learns” and refines constructs much as people do.

Data mining analysis focuses on a variety of potentially productive approaches to making sense of the data, including: association, sequence, clustering, classification, estimation, case-based reasoning, fuzzy logic, genetic algorithms, and fractal-based transforms. Problems in data mining (Moxon, 1996, p. 4) stem from: (1) its susceptibility to “dirty” data—database errors due to magnitude and multiple-source input; (2) an inability to “explain” results in human terms—results often do not fit simple if-then rules; and (3) the data representation gap—problems in combining data forms from different computers and database types.
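As a concrete illustration, the clustering technique named above can be sketched with a minimal k-means procedure. This is a generic sketch, not Moxon’s method; the one-dimensional data (imagined as minutes of weekly practice) and the simple seeding strategy are illustrative assumptions:

```python
# Minimal k-means clustering sketch: repeatedly assign each point to its
# nearest centroid, then recompute centroids as cluster means.
def kmeans(points, k, iterations=100):
    # Simplistic seeding: take k points spread across the sorted input.
    centroids = points[::max(1, len(points) // k)][:k]
    for _ in range(iterations):
        # Assign each point to its nearest centroid.
        clusters = [[] for _ in range(k)]
        for p in points:
            nearest = min(range(k), key=lambda i: abs(p - centroids[i]))
            clusters[nearest].append(p)
        # Recompute each centroid as the mean of its cluster
        # (keep the old centroid if a cluster is empty).
        centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    return centroids, clusters

# Illustrative data: minutes of weekly practice for nine students.
data = [5, 6, 7, 55, 60, 58, 120, 118, 125]
centroids, clusters = kmeans(data, k=3)
print(sorted(round(c) for c in centroids))  # → [6, 58, 121]
```

The algorithm “discovers” three groups (casual, moderate, and intensive practicers) without any prior theory about how many kinds of practicers exist or where the boundaries lie, which is the discovery orientation described above in miniature.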

Database sources of potential research value for music education are increasing. “Big Brother” governments, financial institutions, and Internet companies are collecting vast amounts of data potentially relevant to aspects of music in society. These include government census, national survey and assessment data (Burke, 1997), public and private institutional data banks and data archives (Anderson & Rosier, 1997), and commercial data. For example, music web sites offer their “free” services in exchange for user profile data and then track all music and music-video use. They invite users to create preference lists, to rate individual songs, and so on. This information is stored in a database (now called a data warehouse). Access to government census databases is relatively easy. Access to commercial data may be more difficult. However, the data acquisition is constant; knowledge development is the responsibility of researchers.

Exploratory Data Analysis. EDA, first defined by Tukey (1977), is, in a sense, a manual form of data mining. In the strict scientific method, all analyses are for verification, based on hypotheses articulated before data gathering. Scientific analysis takes its inferential validity from this approach. Consequently, “‘playing around’ with data is not ‘good’ science, not replicable, and perhaps fraudulent. Many experienced researchers pay lip service to this view, while surreptitiously employing ad hoc exploratory procedures they have learned are essential to research” (Leinhardt & Leinhardt, 1997, p. 519). EDA is gaining greater importance because desktop computer analysis makes “snooping” easier, and there is more recognition that finding patterns in nature, the purpose of EDA, is inherently a subjective enterprise. “Exploratory analyses incorporate the wisdom, skill, and intuition of the investigator into the experiment” (Palmer, 2000, p. 2). As an analysis method, EDA blatantly recognizes the “constructive” dimension of scientific research—searching does go on in research. EDA is particularly necessary in educational research because data are often gathered on a hunch, an intuition, or just because they can be gathered and might be useful.

The analyses deemed “exploratory” now often include various dimensional analyses (correspondence, canonical, principal components, smallest space, multidimensional scaling, configuration comparison, factor scoring, metric dimensional scaling) as well as non-hierarchical and hierarchical cluster analyses.
All have a pre-theory or theory formulation function. For statistically rigorous analysis, EDA cannot be combined with confirmatory analysis on the same data set. One way to accommodate this is to acquire a large enough data set to allow its division into an exploratory subset and a confirmatory subset.
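The division described above might be sketched as follows; the record IDs and the fifty-fifty split are illustrative assumptions:

```python
# Sketch of dividing one data set into an exploratory subset and a
# confirmatory subset, so EDA and hypothesis testing never touch the
# same cases.
import random

records = list(range(200))      # stand-ins for 200 participant records
rng = random.Random(42)         # fixed seed makes the split reproducible
rng.shuffle(records)

half = len(records) // 2
exploratory = records[:half]    # snoop freely: EDA, pattern hunting
confirmatory = records[half:]   # held back for confirmatory tests only

assert not set(exploratory) & set(confirmatory)  # subsets are disjoint
print(len(exploratory), len(confirmatory))       # → 100 100
```

Because the split is random and documented (the seed is recorded), another researcher can reproduce it, answering the replicability objection to exploratory “snooping.”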

Theory-Confirming Analysis

The current mind-set in research is increasingly accepting exploratory analysis as not only permissible but necessary. Confirmatory analysis, however, is the natural follow-up. Exploration leads to theory, which then must be subjected to confirmatory procedures. Most traditional parametric and non-parametric statistical analyses serve this purpose (Asmus & Radocy, 1992). Statistical modeling is particularly powerful in theory confirmation, and Asmus and Radocy (1992) identify path analysis and linear structural relations analysis (latent trait modeling) as important approaches. They observe that, “Many in the field of music have claimed that a variety of important musical concepts are simply unmeasurable. Latent trait modeling provides a means of accounting for these ‘unmeasurable’ concepts in complex systems” (Asmus & Radocy, 1992, p. 165).

“A prominent theme in methodological criticism of educational research during the 1980s was the failure of many quantitative studies to attend to the hierarchical, multilevel character of much educational field research data.... Perhaps more profound than such technical difficulties, however, is the impoverishment of conceptualization which single-level models encourage” (Raudenbush & Bryk, 1997, p. 549). An important analytic response to this problem is hierarchical linear modeling, also referred to as multilevel linear models, random coefficient models, and complex covariance components models. Effective explanations of hierarchical linear modeling are given by Goldstein (1987), Bock (1989), Raudenbush and Willms (1991), and Bryk and Raudenbush (1992).
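The core idea of a two-level hierarchical linear model (for example, students nested within schools) can be sketched in notation similar to the convention Bryk and Raudenbush use; the student-level predictor x is an illustrative assumption:

```latex
% Level 1: outcome for student i in school j, with an
% illustrative student-level predictor x (e.g., hours of practice)
y_{ij} = \beta_{0j} + \beta_{1j} x_{ij} + r_{ij}

% Level 2: each school's intercept and slope vary around
% grand means, with school-level random effects u
\beta_{0j} = \gamma_{00} + u_{0j}
\beta_{1j} = \gamma_{10} + u_{1j}
```

The school-level random effects $u_{0j}$ and $u_{1j}$ carry exactly the multilevel character that a single-level regression, which forces one intercept and one slope on every school, conceptually impoverishes.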

Exploration—Confirmation Dynamic

Although technically and ideally the analyses described under the previous two headings precede theory or confirm theory, most analysis realistically is a constructive, dynamic process in which there is some interplay between exploration and explanation, prediction and confirmation. An example of such an approach is Q-methodology (McKeown & Thomas, 1988). Although not strictly a form of analysis, it is a technique that has real potential for exploring intra-personal relations, and as such it is a “qualitative” technique. Q-methodology also lends itself easily to theory confirmation. As an analytic technique it drew serious criticism from strict statisticians, but its blending of qualitative and statistical aspects is causing a resurgence of use in preference-oriented research in political science and psychology.

Repertory Grid Technique is another approach that interacts with analysis to explore and explain. It is designed to elicit how people view the world from their own perspective (analysis of the person’s own internal representation of the world). A variety of mathematical analyses can be conducted (Fransella & Bannister, 1977), but recent developments in concept mapping and dual scaling have potential with repertory grids as well.

Event History Analysis, also known as survival analysis or hazard modeling, is an approach to exploring whether and when events occur. For example, a researcher might explore when a music student learns a particular skill or when people stop playing their band instruments. An analytic challenge is the problem that some people may not have developed the musical skill at the time the researcher is studying it or this skill may never develop. Event history analysis provides a sound mathematical basis for dealing with such anomalies through predictive modeling (Singer & Willett, 1991; Willett & Singer, 1991).
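A minimal sketch of the idea uses a simple Kaplan-Meier estimator, one standard survival-analysis tool (not necessarily the specific models Singer and Willett describe); the data on years of band-instrument playing are invented for illustration:

```python
# Minimal Kaplan-Meier sketch of event history (survival) analysis.
# Each observation: (years played before quitting, observed?).
# observed=False means the student was still playing when the study
# ended — a "censored" case that a simple average would mishandle.
data = [(1, True), (2, True), (2, False), (3, True),
        (4, False), (5, True), (6, False), (6, False)]

def kaplan_meier(observations):
    # Probability of "surviving" (still playing) past each event time.
    # At tied times, process actual events before censorings.
    at_risk = len(observations)
    survival, curve = 1.0, {}
    for time, observed in sorted(observations, key=lambda o: (o[0], not o[1])):
        if observed:                 # an actual quitting event
            survival *= (at_risk - 1) / at_risk
            curve[time] = survival
        at_risk -= 1                 # every case then leaves the risk set
    return curve

curve = kaplan_meier(data)
for t, s in sorted(curve.items()):
    print(f"P(still playing after {t} yr) = {s:.3f}")
```

Note how the censored students still contribute information (they were “at risk” up to the point the study lost sight of them) instead of being discarded or miscounted as quitters, which is the anomaly the paragraph above describes.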

A battery of new techniques has developed for qualitative data analysis. Since qualitative research often focuses on a “grounded theory” approach, these analytic methods tend to contribute both to exploration and confirmation. Contingency Table Analysis, Correspondence Analysis, and Configural Frequency Analysis can be subsumed under Dual Scaling, described above. Galois Lattice, or G-lattice, is a graphic method of examining and representing knowledge structures within small groups (Ander, Joó, and Merö, 1997). Social Network Analysis is an approach that allows for the examination of complex (multiplex, meaning multistranded) social relationship networks through mathematical models instead of simple sociograms (Scott, 1991).

Trend 6: Knowledge Representation Complexity

The earlier sections on Types of Data and Roles of Data explained many aspects of representation. Here the focus is primarily on the way knowledge is represented to others in the process of communicating the results of a research study. As a basic premise, research is the systematic development of knowledge. Knowledge, like intelligence, exists in multiple forms or types. Knowledge, therefore, can be represented in multiple forms. In the past a particular type of knowledge has been favored by the “research culture”; propositional written representation of the researcher’s understanding has been the “privileged representation.” Even at conferences where oral presentations were made, the basic mode of presentation has been to “read the paper.” Numeric forms have been a staple of research. Other forms of linguistic and non-linguistic representation are now emerging in research presentations.

The forms of knowledge representation emerging as research communication can be typified as fitting on related continua: hermeneutically closed to hermeneutically open, propositional to non-propositional, discursive to non-discursive. One of the common aspects of these continua is the perception of interpretive control. Propositional, discursive, hermeneutically closed communication, mainly in written form, is generally perceived as most interpretively controlled by the researcher. Consisting of a linear series of assertions and logical argumentative support, such communication attempts to limit interpretive range, the possibilities of inference. Non-propositional or non-discursive forms are more hermeneutically open, assuming an interpretive, knowledge-constructing role for the receiver of the knowledge communication. Commitment to either extreme of a positivist or constructivist ontology leads to a corresponding commitment on the representation continua.
Recently there has been a growing shift toward constructivist ontology. The result is a trend toward greater use of non-propositional and non-discursive forms of representation of research knowledge.

Written Linguistic Forms

The way language is used to construct thoughts and ideas varies in different forms of literature. Within written language there is a range of hermeneutic control, from propositional argument to artistic, non-propositional forms. A poem carries meaning or represents knowledge in a way quite different from a laboratory experiment report or philosophical argument. But language has a central role in constructing or mediating the construction of our internal representations of reality. Postman (1999) says “language is a major factor in producing our perceptions, judgments, knowledge, and institutions” (p. 70). Postman in this context quotes Einstein saying, “The only justification for our concepts and systems of concepts is that they serve to represent the complex of our experiences; beyond this they have no legitimacy” (p. 70). A problem with language, however, is how to represent “the complex of our experience.” Since the eighteenth century, “scientific” writers have argued or assumed that our experience of reality is best and fully communicated in expository, propositional prose. In the twentieth century “almost every field of scholarship—including psychology, linguistics, sociology, and medicine—was infused with an understanding of the problematic relationship of language to reality” (Postman, 1999, p. 71). Despite extreme post-modernists’ criticisms of the capacity of language to represent reality accurately, “we cannot experience reality bare. We encounter it through a system of codes (language, mathematics, art).... They [the postmodernists] mean to disabuse us of linguistic naivete, to urge us to take account how our codes do their work and enforce their authority” (Postman, 1999, p. 71). We must, however, distinguish among “our codes” as to how they carry meaning. To read poetry as propositional argument is to miss its real meaning. To read every sentence of a story as literal “truth” is probably to miss the real meaning of the story.
Each form of literature carries meaning in a particular way. If we restrict ourselves to expository prose, we restrict the kinds of meanings we can take from “research,” limit our knowledge, limit our communication power, and distort understanding.

Expository prose. The expository, propositional essay remains one of the most effective ways of communicating a researcher’s knowledge to others. One of the changes in the past ten years, however, has been in the “voice” of such writing. There is now increasing recognition that all knowledge is someone’s knowledge rather than objective truth. With this recognition comes a need for ownership. Consequently, use of first person is replacing the air of depersonalized, scientific objectivity associated with third person expressions. The austerity of expository prose is now being recognized as a form of rhetoric and is often moderated with the inclusion of informal speech transcription from interview or observational data, personal journal notes, or other forms of language. Philosophy and criticism are a special form of argumentative propositional literature.

History. According to Barzun and Graff (1992) in their classic book, The Modern Researcher, all research reporting is writing history. However, there is a form of literature we recognize as history. It is in essence telling a story in propositional form, consciously interpretive, particularly when it is analytic, thematic history. Biography frequently attempts to be thematic and analytic but also to capture aspects of the person’s personality and character through enlivened stories. In the form of autobiography or personal diary, which is gaining a place within research methods like personal narrative, the text can be less propositional and more reflectively poetic (Karsemeyer, 2000; Kernohan, 2005).

Story. Since ancient times, stories, myths, tales, and epics have been used to communicate ideas, truths, and knowledge essential to the cultures that gave rise to them. For example, the parables in the New Testament clearly have meaning. The perceived meaning may vary from person to person, yet the general point is powerfully understood. That same point could not be made as powerfully with a propositional essay. Barone (1983), Adler (2002), and others have demonstrated that constructing knowledge gained from research as a story communicates aspects of knowledge beyond the propositional.

Poetry. The “poetic” nature of language resides in the way meaning constructs are accessed. In propositional statements, the specific form of a construct, the precise meaning to be made out of a particular word, is controlled by the syntax and context. In poetic expressions the constructs, with all their potential multiple meanings, are activated. New links between constructs are explored by putting words next to each other that might not ordinarily be thought of together, thereby requiring a re-examination of each construct’s contents. The non-discursive nature of poetic language results potentially in both richer connotative, affective meaning and less denotative meaning. The lack of contextual meaning specification and the juxtaposition of constructs can thereby bring to consciousness links, such as emotional memories, not accessed in the more controlled form of propositional discourse. Although relatively rare at this point, there are signs that researchers are exploring these hermeneutically open forms of poetry (Adler, 2002; Leggo, 2001; Kernohan, 2005), prosetry (Andrews, 2000), or metaphorically rich reflections (Denton, 1996). The “hermeneutically open” research report still allows the research reporter some meaning-making power by the very selection of words, metaphors, images, anecdotes, narratives, or interview quotes. However, what meaning is made from the words and images selected is entrusted to the reader.

Drama (as script). The dialogic interaction of questionnaire and respondent, of interviewer and interviewee, of observed and observer, naturally evokes the essence of drama. Dramatic interaction is not, however, simply text followed by text; it is “dramatic” in the sense that the text is enlivened with additional meaning conveyed by inflection, vocal tone, pacing, gesture, facial expression, and so on. The transcription of an interview does not capture these layers of meaning communicated by the person interviewed. “Stage directions” written into the “script” may convey some of these meaning dimensions as the drama stimulates imagination and is recreated in the mind of the reader. Consequently, researchers are beginning to write at least parts of research reports in this form. Reynolds (1996) included the script for a radio play as a preface to the dissertation. Lamb (1991, 1994), Baskwill (2001), O’Toole (1994), and Adler (2002) reported significant dimensions of their research as drama scripts. At times participants in research presentations are enlisted to enact these scripts as reader theatre. Vitale (2002) expressed the core of his findings in a movie script. The challenge for the researcher attempting to communicate additional meaning through written dramatic dialogue is first the selection of a potent transcript. More importantly, the stylistic challenge is described by Van Manen (1990): to be a good dramatic script, we want “language that lets itself be spoken and used as thought” (p. 32). However, what is to be represented, as fully and richly as possible, is still only the knowledge discovered, supported, or clarified in the intentional research process. To go beyond that risks the imposition of personal interpretive agendas.

Spoken Language

An oral research report carries a dimension of meaning not present in a written report. Non-discursive, supra-segmental sounds (Farahani, Georgiou, & Narayanan, 2004) that we hear as vocal inflection, emphasis, and tone indicate importance, certainty, excitement, and so on in specific parts of the report. In addition, gesture and facial expression carry meaning that clarifies and enhances understanding. These dimensions are important since communication of meaning is enhanced if multiple representations (linguistic, gestural, tonal, visual) are employed. However, the persuasiveness and appeal of the presenter may affect the acceptance of the research. There is a research culture, and norms within that culture may favor a specific style of presentation. For example, in The Little Prince (Saint-Exupéry, 1943), the Turkish astronomer makes a discovery and “On making his discovery, the astronomer had presented it to the International Astronomical Congress, in a great demonstration. But he was in Turkish costume, and so nobody would believe what he said. Grown-ups are like that…” (p. 17). John Kenneth Galbraith (quoted in Smithrim, Upitis, Meban, & Patteson, 2000) observes, “There are a significant number of learned men and women who hold that any successful effort to make ideas lively, intelligible, and interesting is a manifestation of deficient scholarship. This is the fortress behind which the minimally coherent regularly find refuge.” There is a trend toward livelier, more multi-representational communications of research knowledge.

Numeric Representations

Numbers are an important form of representation in research reports, but one that communicates more than information about the phenomenon the numbers are representing. For many people in our society, statistical numbers mean “scientific” research, and the credibility of any attendant assertions rises. At the same time, for many the numbers are baffling and intimidating, with the result that few of those who might benefit from the knowledge intended to be communicated actually understand and receive the communication. Unfortunately, the group that sees numbers as “scientific” includes not only readers of research but also some researchers who have minimal understanding of the numbers they employ. Numbers can be misleading, especially when they are separated from what they represent. A recent trend in research reporting, especially in the popular press, is describing the effect of a new treatment, for example, as percent improvement or likelihood, rather than in terms of significance or actual probabilities. For example, Altman (1999) reports in the New York Times that a new drug produces a 76% reduction in breast cancer. In actual fact, one percent of the study’s participants who took a placebo over three years got breast cancer, while one quarter of one percent of those taking the drug got the disease. The use of the “76% improvement” sounds impressive but does not communicate the importance or significance of the finding and becomes highly misleading (Smallwood, 2000, p. 17).
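The arithmetic behind this example can be made explicit as the distinction between relative and absolute risk. The rates below are approximations of the figures cited in the text:

```python
# Why "76% reduction" can mislead: relative vs. absolute risk,
# using approximate rates from the study described above.
placebo_rate = 0.0100   # 1% of the placebo group developed the disease
drug_rate    = 0.0024   # roughly a quarter of one percent on the drug

relative_reduction = (placebo_rate - drug_rate) / placebo_rate
absolute_reduction = placebo_rate - drug_rate

print(f"Relative reduction: {relative_reduction:.0%}")   # → 76%
print(f"Absolute reduction: {absolute_reduction:.2%}")   # → 0.76%
# Put differently: about 132 people must take the drug for three
# years to prevent one case (the "number needed to treat").
print(f"Number needed to treat: {1 / absolute_reduction:.0f}")
```

The headline figure (76%) and the absolute change (0.76 percentage points) are both arithmetically correct; the misleading effect comes entirely from which one is reported without the other.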

Graphic Representations

Graphic representations of research knowledge are designed to communicate information through the complex visual processing system. In the process of communicating, we can entertain, persuade, inform, or mislead. Wilkinson (1988) argues that “Many designers of quantitative graphics confuse these functions or subordinate informing to other goals. Sometimes this is intentional, as in graphic propaganda, but often it is inadvertent, as in popular newspaper graphs which distort their message with bright colors and ‘perspective’ views” (p. 61). Because computer statistics and graphics programs make it easy, and almost the only choice, researchers increasingly use three-dimensional bar graphs and pie charts (e.g., Asmus, 1994, pp. 18-21). In every case, the three-dimensional perspective makes the graphic more “entertaining” but less clear and, therefore, less informative. For example, in the bar graphs Asmus (1994) presents, it is very difficult to determine to what point on the vertical axis a bar corresponds. There is no doubt about the potential for graphic representation to enhance the communication of knowledge in both written and oral presentations. However, we must resist trendy “entertainment” and choose clear information.

“To envision information—and what bright splendid visions can result—is to work at the intersection of image, word, number, art” (Tufte, 1990, p. 9). But the challenge in this process of “envisioning information” stems from the fact that “the world is complex, dynamic, multidimensional; the paper is static, flat. How are we to represent the rich visual world of experience and measurement on mere flatland?” (Tufte, 1990, p. 9). The use of three-dimensional perspective in graphs is not the answer. The multi-dimensionality lies in meaning, not in perspective. The “flatland” is not simply the paper or video screen surface but the simplicity of representation. The escape from the “flatland” is through “progress of methods for enhancing density, complexity, dimensionality, and even sometimes beauty” (Tufte, 1990, p. 33).

Artistic Representation

Musical. Although music is an artistic, non-propositional phenomenon, music education researchers have been slow to embrace the trend in educational research toward arts-based, or arts-informed inquiry. The problem for music is essentially our inability to see how making music could be an expression of knowledge gained through research. The key is in the research question asked. If the data required to answer the question are musical, then presenting samples of data can inform the communication of knowledge gained. In fact, the musical data, as representations of the phenomenon being studied, are the only way to communicate a dimension of meaning involved. For example, if the researcher is asking what musical decisions expert conductors make to balance musical perfection and the limited abilities of musicians, the best data would be actual rehearsal events. Once the researcher “knows” the answer, the communication of that knowledge would be through the researcher demonstrating the same ability with an ensemble. That might be a research report at a research conference. Music as an additional layer of meaning in a research presentation has been used and will be discussed under combination forms.

Theatre. Drama scripts were described as a form of representation earlier. Staging an actual theatrical production is the next step. Just as an oral report enhances a written paper, an acted drama is more powerful than a written one. Just as actually performing music requires a different level of understanding than merely talking about or analyzing music, so “acting” the new knowledge about a teacher role, a student’s struggle, or a parent’s dilemma requires and presents a different level of insight. The act of communicating in a dramatic way also embodies in a holistic manner a representation of knowledge. An example of such a dramatic production was “Hong Kong, Canada,” written by the researcher Tara Goldstein (2000). The script was “based on four years of ethnographic research in a multilingual, multiracial school” (Program Notes), and was refined as a result of the feedback from two workshop readings of the play by graduate students in the researcher’s classes. The staged, one-act play was a representation of the knowledge gained in the research.

Visual Images. A four-page photo essay in a magazine may be more powerful in communicating elements of a phenomenon than a 40-page written essay might be. At least, the photo essay would communicate something unique and different from the written essay. For this reason researchers now may include visual images as a data source and as an integral part of a research report (Neilsen, 2001). Methodologies like visual anthropology (Banks, 1998) and visual sociology (Harper, 1998) are developing. Karsemeyer (2000) used one photograph to represent the central meaning of her dissertation on dance. Woolley (2000) included a photo essay as a representation of children’s engagement. Huberman (2004) used both photographs and children’s drawings as data and as a means of reporting her findings. Illustration instead of photograph could function in the same way.

Combination forms

Many of the previously mentioned modes of representation are used within the context of another, e.g., numbers in a propositional essay or graphs in an oral report. There are possibilities that combine more modes, particularly the artistic.

Documentary. The video documentary is a form being explored in video journals like The Video Journal of Education published by Gale. The potential of this medium is to combine researcher comment (in the propositional argument form) with other oral presentation, graphs, actual research participant comment, dramatic episodes, photographic images, and illustrations, with background or interlude music. The modes of representation could encompass all types of intelligence and knowledge. The visual is a substantial part of the documentary. “Photographs get meaning, like all cultural objects, from their context” (Becker, 1998, p. 88). The documentary form offers a rich, multifaceted context, controlled to a great extent by the researcher/documentary maker. As such it has powerful potential for the communication of complex knowledge. An oral history of a significant leader or a case study of a school program might be the core of such a documentary, but the potential is for great creative flexibility. An example from sociology is the documentary “You’re Blind”: Black Athletes and Ads (Harrison, 2001).

Paper Performance. Performance texts, described by Denzin (1997) as “poems, scripts, short stories, dramas that are read, and performed before audiences” (p. 179), may border on performance art or a monologue drama but involve parallel representational “performances.” An example is the paper/performance delivered in several academic contexts entitled “Dorothy troubles Musicland” (Lamb, 1997). In this presentation, Lamb read a formal, expository essay in parallel with a series of recordings from Tchaikovsky to current alternative rock groups, while transforming her appearance step by step by the removal and addition of clothing and ornamentation, from wool-suited professor to “leather dyke.” The two non-linguistic modes of representation were illustrative of the propositions of the essay, serving interactively in the meaning construction of the audience. They were parallel communications, each adding its own set of meanings, but also creating a reflexive whole greater than its parts.

Researcher Bias

An important consideration in all forms of representation is researcher bias. The term “bias,” however, is a problem. Traditional scientific researchers may take “bias” to mean an expression of the researcher’s voice in the report, evidence of the researcher’s personal construct-oriented interpretation, or research that involves data gathering through intentional experience. However, obscuring the researcher’s voice may itself create bias; all research involves someone’s interpretation, and all research data gathering is done through a construct structure, regardless of the form of representation or the “fixing” method. Bias is more a matter of researcher integrity. Bias in representation is a form of error resulting from the researcher’s mis-perception of phenomenal features due to existing construct rigidity, lack of fidelity to a consensus of perception, or a deliberate mis-“representation” of knowledge due to conscious or unconscious allegiance to a pre-existing construct. Traditional scientific researchers believed that adherence to methodological formulae and descriptive propositional reporting of results would prevent bias. What was not recognized was that the “objective,” “scientific,” “truth” aura that clung to such reports could communicate a “bias” (an erroneous representation of knowledge) despite the researcher’s best intentions. Further, the limited range of questions that could be explored with these “scientific” methods indicated a bias (or ontological commitment) in the very choice of methodology. Since the possibility of mis-perception, mis-representation, or failure to develop a broad enough consensus (among instruments or perceivers) is inherent in all research, describing the intentionality of data acquisition, personalizing the knowledge claims, and acknowledging the interpretive act are essential criteria for researcher integrity.

The most researcher-controlled forms of representation of knowledge are the written expository essay and its oral presentation. Within the restricted range of knowledge communicated by propositional language, and to the extent that language can encode meaning in the least ambiguous way, such research reports minimize potential misinterpretation and unrecognized bias. When the researcher chooses a form of representation that is less propositional, there is more complex meaning: more levels or layers of meaning drawing on more types of intelligence (more hot buttons on the web site being accessed). The potential for misinterpretation (or the construction of a personal interpretation) rises as well, a prospect that worries those with an ontological commitment to objective truth discoverable through empirical evidence and logical reasoning. If knowledge is recognized as constructed and personal, and research as ultimately about knowledge development, then research-derived knowledge representations that serve as stimuli to personal knowledge construction should be clear but potent. Multiple forms of representation have the best chance of altering the receiver’s knowledge structure because each form contributes a dimension the others lack. As Babbie (1995) observed about research orientations, “each…offers insights that others lack -- but ignores aspects … other paradigms [representations in this case] reveal” (p. 41).

Although multiple forms of representation contribute dimensions of meaning, the forms are not equal in communicative power. Marshall McLuhan gave us the concept of a hot medium (such as print) versus a cool medium (such as television). We recognize now that it is more than media. The anecdote (story) is a particularly powerful form. Shenk (1997) states that “anecdotage is a particular problem in the context of today’s media age. ‘With the sophisticated mental apparatus we have used to build world eminence as a species,’ Robert Cialdini says of this catch-22, ‘we have created an environment so complex, fast-paced, and information-laden that we must increasingly deal with it in the fashion of the animals we long ago transcended’ ” (p. 159). According to Shenk, we deal with life’s complexity by clinging to simple stories. Social psychologists have shown how meeting a particular case (e.g., a brutal prison guard) can overshadow considerable statistical data about the general category (e.g., all prison guards) (Shenk, 1997, pp. 156-157). Researchers selecting story and other artistic forms of representation must be fully aware of the power of their selected communication medium, exercise caution, and demonstrate integrity in how they shape the meaning of the communication. Shenk (1997) expresses the caution: “beware stories that dissolve all complexity” (p. 157).

Trend 7: Dissemination Complexity

Research dissemination is about facilitating knowledge development in others. Frequently researchers believe, or at least act as if, dissemination means “putting the information out there,” leaving others to read and try to understand if motivated. However, communication can be more active and more reflexive; researchers have a responsibility for developing knowledge in others.

There are two important issues related to dissemination of research-derived knowledge: in whom should knowledge be developed, and how. Until recently, new research-derived knowledge was primarily communicated to other academic researchers. Researchers have lamented the lack of application of research by practitioners, yet have lived in, and perpetuated, a system that does not value practical dissemination. Now practitioners and the general public are coming to expect that research findings will inform aspects of their daily lives, from diet choices to what music they play for their babies.

In whom is knowledge to be developed. The groups to whom research knowledge should be communicated are now more numerous than they were even a decade ago. These now include: (1) Researchers in music education; (2) Researchers in other disciplines; (3) Users of research. Researchers and their university establishments must learn to value direct communication with teachers through in-service workshops, demonstrations, and the development of pedagogical resources as highly as they value refereed publication; (4) Advocacy groups; (5) The music industry. Music education researchers may be uncomfortable with the industrial R&D approach, but often there is “product” development potential in research. Conflict may, of course, arise between research “objectivity” and desired outcomes in research intentionally conducted for advocacy groups or proprietary purposes; (6) The public.

To communicate effectively with teachers, advocacy groups, the music industry, and the general public the researcher needs to take a “public relations” view. One important requirement is the elimination of jargon. Although we may be dealing with complex phenomena and ideas, understanding starts with simplification rather than complexification. This may be through a potent visual image, story, or specific case example. The temptation is to sensationalize and thereby distort, as well as to allow bias to enter the communication. Although the researcher targets communication to specific groups, what is communicated must be the result of real research.

How is knowledge to be developed. “How” to communicate research knowledge depends to some extent on “to whom” it is to be communicated, but today there are several ways to reach almost every target population. Three variables influence the choice of dissemination medium: time, cost, and form of representation required. The effect of electronic media on time is obvious. The need for face-to-face meeting and discussion versus electronic forms has the largest bearing on cost. The form the knowledge representation takes greatly affects the choice of medium for dissemination. When knowledge was represented only in static visual graphics or text, print publication was efficient. The recognition that knowledge exists in gesture, image, or sound requires alternative forms of knowledge communication. Knowledge performances require “in person” presentation or must be captured on video. Combining text, talk, music, and photographed or computer-animated video images can be done with newer media forms such as DVD, CD-ROM, or Internet-based web productions. Since there is a trend in music education research toward multiple forms of representation of research knowledge, more complex media will be employed in dissemination.

An example of the “public relations” approach to research dissemination is described in detail by Smithrim, Upitis, Meban, and Patteson (2000). The premise of the paper is that researchers must go “public or perish.” Efforts to communicate to their “publics” have included: reports at nine academic conferences, papers for refereed journals, interviews on morning radio, photographs in the university newspaper, a story for the local paper, an exhibit of children’s art work from the project, articles in the newspapers of the participating research schools, creation of a website, a video describing the project, and a glossy brochure to accompany press releases or for conference distribution. The researchers became advocates for their own research project.

Conclusion

This chapter has essentially been about ontology and epistemology, reality and knowledge, external and internal representation. I have argued that reality is complex but exists apart from representations of it, and that knowledge about reality is constructed through a process of prediction, verification, or accommodation. The mind is engaged in a process of acquiring data related to reality and, from these data, constructing knowledge in a systematic way. Research is the same process, only formalized. Since all the senses provide data for the knowledge constructing process, knowledge exists in multiple forms. If knowledge exists internally in multiple forms, it is most accurately represented externally in multiple ways.

The general trend I have explored has a twofold manifestation: (1) the phenomena in which researchers are interested are increasing in complexity; and (2) there is increasing recognition that all human behaviors and characteristics are inherently complex. I have argued that this means that mental constructs associated with all human behavior and expression must be regarded as inter-related and multifaceted, and that theoretical research constructs must reflect this complexity. What follows is that data have multiple meanings and connections, yet do not fully represent the associated phenomena. To address complex constructs, given the limitations of data, a variety of research methods must be employed. Multiple data and varied methods lead to complex analyses that accommodate multiple levels, hierarchical structures, nonlinear relations, and non-propositional, non-discursive data.

Although not directly applying complexity theory (chaos theory), I have purposefully aligned some of my rhetoric with it. One of the most important implications of complexity theory is that researchers will never be able to make a complete description and will never be able to completely predict phenomena. This is implied by Cronbach’s (1975) “hall of mirrors” (p. 119) or empty “first cells before we had half the battery completed” (p. 123).
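The core implication of complexity theory named here, that complete prediction is impossible even in a fully deterministic system, can be made concrete with a standard textbook illustration (my example, not the chapter's): the logistic map in its chaotic regime, where two trajectories starting almost identically soon bear no resemblance to each other.

```python
# Sensitive dependence on initial conditions in the logistic map
# x_{t+1} = r * x_t * (1 - x_t) at r = 4, its chaotic regime.
# Two starting points differing by one ten-billionth soon diverge
# to the full scale of the system itself.
r = 4.0
x, y = 0.2, 0.2 + 1e-10   # nearly identical initial conditions
max_gap = 0.0
for step in range(60):
    x = r * x * (1 - x)
    y = r * y * (1 - y)
    max_gap = max(max_gap, abs(x - y))
print(f"largest gap over 60 steps: {max_gap:.3f}")
```

Because any measurement of an initial condition is finite in precision, the long-run behavior of such a system cannot be completely predicted, which is the sense in which researchers "will never be able to completely predict phenomena."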

Can research exist without a drive to predict and control? Is research not about theory, theory not about prediction, and prediction not about control? The hope of researchers has been to find that “other elusive variable,” another interaction, a more detailed path analysis. We know that students do not drop out of band simply because their aptitude is low and they are assigned to an instrument whose timbre they do not like. But we believe that if we also account for intelligence, teacher behavior, home support, socio-economic status, and early music experiences, we will be closer. And do we assume that one day we could account for 100% of the variance and predict who will stay and who will drop out?
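The hope of "accounting for variance" can be sketched with a small, entirely hypothetical simulation. The predictor names echo the variables mentioned above, but the data and coefficients are invented for illustration; the point is that an irreducible noise term caps the proportion of variance (R-squared) any such model can explain.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200

# Invented predictors of persistence in band (illustrative only)
aptitude = rng.normal(50.0, 10.0, n)
home_support = rng.normal(0.0, 1.0, n)
teacher_rapport = rng.normal(0.0, 1.0, n)

# A persistence score built from the predictors plus noise; the noise term
# stands for everything no finite list of variables will ever capture.
persistence = (0.4 * aptitude + 3.0 * home_support + 2.0 * teacher_rapport
               + rng.normal(0.0, 8.0, n))

# Ordinary least squares fit, then proportion of variance explained (R^2)
X = np.column_stack([np.ones(n), aptitude, home_support, teacher_rapport])
beta, *_ = np.linalg.lstsq(X, persistence, rcond=None)
residual = persistence - X @ beta
r_squared = 1.0 - residual.var() / persistence.var()
print(f"R^2 = {r_squared:.2f}")  # well short of 1.0 by construction
```

Adding more predictors of the same kind raises R-squared only toward the ceiling set by the noise, never to 100% of the variance.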

The prospect of not being able to discover answers, or even partial answers, to many of our problems and challenges is probably frightening to music education researchers. As mentioned earlier, each researcher functions in a research culture. Ours is one influenced by the fact that we are musicians, educators, and researchers. Musicians in the western European tradition are masters of replicative music-making in which minute control of every muscle, pitch, rhythm, timbre, and nuance is practiced through hours and hours of repetitive, disciplined effort. Most challenges must be solved before a successful performance is realized. As educators, we function in a milieu where the normal need for class control is amplified by the noisiness of our art and our love of large-group performance. The pressure of public performance makes efficiency and learning management paramount. As researchers, we live in an era of rational science whose central purpose is prediction and control.

So how can we respond to the challenge of complexity? We can draw on our unique strengths as music education researchers—persistence, discipline, and creative thinking. But in addition we need to be more rigorous in conceptual analysis to understand the constructs we encounter. We must be ingenious in identifying the forms of data that most validly represent the phenomena we value. We must be flexible and open-minded in selecting methods to provide reliable observations. We must be disciplined in analysis to transcend the easy answer, the known analytic technique, or the simple solution. We must be daring, confident, and willing to engage the people who matter to music education in the development of knowledge.

In the face of complexity, we must also realize that we are not going to solve the whole puzzle and find every answer. As a result we may ask different questions. First, although we cannot manage the whole system, we are part of the system and need to realize how we are connected to others; we may ask questions that help us understand another’s perspective, another’s plight, another’s joy. Second, we can change from a perspective of only finding problems to which we can match solutions to one in which we see beauty in life’s chaos, and describe that beauty. Third, we must engage in “lateral thinking,” encountering the complexity to find imaginative new solutions. Fourth, we must believe in small, local efforts that can have global results. A common metaphor in chaos theory is the assertion that a butterfly flapping its wings in China can affect the weather in New York. Although our power is limited, we are part of a dynamic system and we do have influence. Finally, we must learn to discover, attend to, and appreciate life’s rich subtleties.


Adler, A. (2000).

Ander, C., Joó, A., & Merö, L. (1997). Galois Lattices. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 543-549). Tarrytown, New York: Pergamon Elsevier Science Inc.

Anderson, J. & Rosier, M.J. (1997). Data banks and data archives. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 344-349). Tarrytown, New York: Pergamon Elsevier Science Inc.

Andrews, B. (2000). Land of Shadows. Language and Literacy: A Canadian Educational E-Journal, 2(1), <>

Asmus, E. P. & Radocy, R. E. (1992). Quantitative analysis. In R. Colwell (Ed.), Handbook of research in music teaching and learning (pp. 141-183). New York: Schirmer Books.

Babbie, E.R. (1995). The practice of social research (7th ed.). Belmont, CA: Wadsworth Publishing.

Barrowman, N. (1998). A survey of meta-analysis. Dalhousie University. <>

Bartel, L. (2000). A foundation for research in the effects of sound on brain and body functions. In H. Jorgensen (Ed.), Challenges in Music Education Research and Practice for a New Millennium (pp. 58-64). Oslo, Norway: Norges musikkhogskole.

Barone, T. (1983). Things of use and things of beauty: The story of the Swain County High School Arts Program. Daedalus, 112(3), 1-28.

Birge, R. T. (1932). The calculation of errors by the method of least squares. Physical Review, 40, 207-227.

Bock, R.D. (Ed.) (1989). Multilevel analysis of educational research. New York: Academic Press.

Briggs, J. & Peat, D. (1999). Seven life lessons of chaos. New York: HarperCollins Publishers.

Bryk, A.S. & Raudenbush, S.W. (1992). Hierarchical linear models: Applications and data-analysis methods. Beverly Hills, CA: Sage.

Burke, G. (1997). Census and national survey data. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 323-327). Tarrytown, New York: Pergamon Elsevier Science Inc.

Byrne, D. (2000). Complexity theory and social research. Social Research Update, 18, 1-7.

Cameron, L. & Bartel, L. (2000). Engage or disengage: An inquiry into lasting response to music teaching. Orbit, 31(1), 22-25.

Chou, S. L. (1992). Research methods in psychology: A primer. Calgary, AB: Detselig Enterprises Ltd.

Cochran, W.G. (1937). Problems arising in analysis of a series of similar experiments. Journal of the Royal Statistical Society (Supplement), 4, 102-118.

Cotes, R. (1722). Aestimatio Errorum in Mixta Mathesi, per Variationes Partium Trianguli Plani et Sphaerici. Part of Cotes's Opera Miscellanea, published with Harmonia Mensurarum, ed. R. Smith. Cambridge.

Cronbach, L. (1957). The two disciplines of scientific psychology. American Psychologist, 12, 671-684.

Cronbach, L. (1971). Test validation. In R. L. Thorndike (Ed.), Educational Measurement, (2nd Ed.). Washington: American Council on Education.

Cronbach, L. (1975). Beyond the two disciplines of scientific psychology. American Psychologist, 30, 116-127.

Cronbach, L. (1982). Prudent aspirations for social inquiry. In W. Kruskal (Ed.), The social sciences: Their nature and uses (pp. 61-82). Chicago: University of Chicago Press.

Cziko, G. A. (1989). Unpredictability and indeterminism in human behavior: Arguments and implications for educational research. Educational Researcher, 18(3), 17-25.

Csikszentmihalyi, M. (1993). The evolving self: A psychology for the third millennium. New York: HarperCollins Publishers, Inc.

Davis, R., Shrobe, H., & Szolovits, P. (1993). What is a knowledge representation? AI Magazine, 14(1), 17-33.

Denton, D. (1996). In the tenderness of stone: A poetics of the heart. (Doctoral dissertation, Ontario Institute for Studies in Education, University of Toronto, 1996). Dissertation Abstracts International, 58(06), 2085.

Dolloff, L. (1999). Imagining ourselves as teachers: The development of teacher identity in music teacher education. Music Education Research, 1(2), 191-207.

Donmoyer, R. (1990). Generalizability and the single-case study. In E.W. Eisner & A. Peshkin (Eds.). Qualitative inquiry in education: The continuing debate (pp. 175-200). New York: Teachers College Press.

DuMouchel, W. (1990). Bayesian meta-analysis. In D. A. Berry, (Ed.) Statistical methodology in the pharmaceutical sciences. New York: Dekker.

Efron, B. (1996). Empirical Bayes methods for combining likelihoods. Journal of the American Statistical Association, 91, 538-565.

Egger, M. & Smith G. D. (1997). Meta-analysis: Potentials and promise. British Medical Journal, 315, 1371-1374.

Fransella, F. & Bannister, D. (1977). A manual for repertory grid technique. London: Academic Press.

Fine, A. (1998). Scientific realism and antirealism. In E. Craig (Ed.), Routledge Encyclopedia of Philosophy Online.

Fisher, R.A. (1932). Statistical methods for research workers (4th edition). London: Oliver and Boyd.

Gabrielsson, A. (1991). Experiencing music. Canadian Journal of Research in Music Education, 33 (ISME Research Edition), 21-26.

Gardner, H. (1999). Intelligence Reframed. New York, N.Y.: Basic Books.

Glass, G.V. (1976). Primary, secondary, and meta-analysis. Educational Researcher, 5, 3-8.

Glass, G.V., McGaw, B., & Smith, M.L. (1981). Meta-analysis in social research. Beverley Hills, CA: Sage.

Goldstein, H. (1987). Multilevel models in educational and social research. Oxford: Oxford University Press.

Goldstein, T. (1999). Program Notes for “Hong Kong, Canada.” University of Toronto.

Haig, B.D. (1997). Feminist research methodology. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 180-185). Tarrytown, New York: Pergamon Elsevier Science Inc.

Hayles, N. K., (1991). Chaos and order. Chicago: University of Chicago Press.

Homans, G. C. (1974). Social behavior: Its elementary forms. New York: Harcourt Brace Jovanovich.

Hunter, J.E., Schmidt, F.L. & Jackson, G.B. (1982). Meta-analysis: Cumulating research findings across studies. Beverley Hills, CA: Sage.

Ignatieff, M. (2000). The rights revolution. Toronto: House of Anansi Press Ltd.

Jacques, B. (2000). Abuse and persistence: Why do they do it? Canadian Music Educator, 42(2), 8-13.

Jutai, J., Knox, R., Rumney, P., Gates, R., Wit, V. & Bartel, L. (1997, May). Musical training methods to improve recovery of attention and memory following head injury. Paper presented at the Executive Function and Developmental Psychopathology: Theory and Application Conference, University of Toronto.

Kafre Systems International. (1998). On eliminating risk by managing complexity. Technical Report KSI-TN-100102. <>

Karsemeyer, J. (2000). Moved by the spirit: A narrative inquiry. Unpublished doctoral dissertation, University of Toronto, Toronto.

Keeves, J.P. (1997a). Introduction: Advances in measurement in education. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 705-712). Tarrytown, New York: Pergamon Elsevier Science Inc.

Keeves, J.P. (1997b). Introduction: Methods and processes in educational research. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 277-285). Tarrytown, New York: Pergamon Elsevier Science Inc.

Kelly, G. (1955). The psychology of personal constructs. 2 vols. New York: Norton.

Lamb, R. (1997a, February). "Dorothy troubles Musicland," Paper presented at the Faculty of Music, University of Toronto.

Lamb, R. (1997b). Music trouble: Desire, discourse, and the pedagogy project. Canadian University Music Review, 18(1), 84-98.

Lamb, R. (1998, May). Mentoring: Master teacher/student apprentice as pedagogy in music. Paper presented at the Canadian University Music Society, University of Ottawa.

Lamb, R. (1999). "I never really thought about it": Master/apprentice as pedagogy in music. In K. Armatage (Ed.), Equity and How to Get It: Rescuing Graduate Studies (pp. 213-238). Toronto: Inanna Publications and Education Inc.

Lawson, M.J. (1997). Concept mapping. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 290-296). Tarrytown, New York: Pergamon Elsevier Science Inc.

Leinhardt, G. & Leinhardt, S. (1997). Exploratory data analysis. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 519-528). Tarrytown, New York: Pergamon Elsevier Science Inc.

Light, R.J. & Pillemer, D.B. (1984). Summing up: The science of reviewing research. Cambridge, MA: Harvard University Press.

Linacre, J. M. (1995). Rasch measurement: Construct clear concepts and useful numbers! University of Chicago, <>.

Lincoln, Y.S. & Guba, E.G. (1984). Naturalistic Inquiry. Beverly Hills, California: Sage.

McCarthy, M. (1997). The foundations of sociology in American music education, 1900-1935. In R. Rideout (Ed.). On the sociology of music education (pp. 71-80). Norman, OK: School of Music, University of Oklahoma.

McCarthy, M. (2000). Music Matters: A philosophical foundation for a sociology of music education. Bulletin of the Council for Research in Music Education, 144, 3-9.

McGaw, B. (1997). Meta-analysis. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 371-380). Tarrytown, New York: Pergamon Elsevier Science Inc.

McKeown, B. & Thomas, D. (1988). Q methodology. Newbury Park, CA: Sage.

Moxon, B. (1996). The Hows and Whys of Data Mining, and How It Differs From Other Analytical Techniques. DBMS Data Warehouse Supplement, August.

Nishisato, S. & Nishisato, I. (1984). An introduction to dual scaling. MicroStats.

Nishisato, S. (1980). Analysis of categorical data: dual scaling and its applications. Toronto: University of Toronto Press.

Nishisato, S. (1994). Elements of dual scaling: An introduction to practical data analysis. Hillsdale, NJ: Lawrence-Erlbaum.

Novak, J. (1990). Concept mapping: A useful device for science education. Journal of Research in Science Teaching, 27(10), 937-949.

O’Toole, P. A. (1994). Redirecting the choral classroom: A feminist poststructural analysis of power relations within three choral classrooms. Dissertation Abstracts International, 55(07), 1864. (University Microfilms No. AAT 9426965)

Palmer, M. (2000). Hypothesis-driven and exploratory data analysis. <>

Paul, S.J. (2000). The sociological foundations of David Elliott’s “Music Matters” philosophy. Bulletin of the Council for Research in Music Education, 144, 11-20.

Pearson, K. (1904). Report on certain enteric fever inoculations. British Medical Journal, 2, 1243-1246.

Perlmutter, M. L. & Perkins, D. N. (1982). A model of aesthetic response. In S. S. Madeja & D.N. Perkins (Eds.), A model for aesthetic response in the arts (pp. 1-29). St. Louis, MO: CEMREL, Inc.

Raudenbush, S.W. & Bryk, A.S. (1997). Hierarchical linear modeling. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 549-556). Tarrytown, New York: Pergamon Elsevier Science Inc.

Raudenbush, S.W. & Willms, J.D. (1991). Schools, classrooms, and pupils: International studies of schooling from a multilevel perspective. New York: Academic Press.

Rosenthal, R. (1991). Meta-analytic procedures for social research, Revised Ed. Newbury Park, CA: Sage.

Reynolds, J. L. (1996). Wind and song: Sound and freedom in musical creativity. (Doctoral dissertation, University of Toronto, 1996). Dissertation Abstracts International, 58(06), 2127.

Rothe, J.P. (1993). Qualitative research: A Practical guide. Toronto: PDE Publications.

Scott, J. (1991). Social network analysis: A handbook. Newbury Park, CA: Sage.

Sharpley, C.F. (1997). Single case research: Measuring change. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 451-455). Tarrytown, New York: Pergamon Elsevier Science Inc.

Shenk, D. (1997). Data fog: Surviving the information glut. New York: HarperEdge.

Shulman, L. (1988). Disciplines of inquiry in education: An overview. In R.M. Jaeger (Ed.), Complementary Methods for Research in Education (pp. 3-17). Washington, D.C.: American Educational Research Association.

Singer, J.D. & Willett, J.B. (1991). Modeling the days of our lives: Using survival analysis when designing and analyzing longitudinal studies of duration and the timing of events. Psychological Bulletin, 110(2), 268-298.

Smith, T. C., Spiegelhalter, D. J., & Thomas, A. (1995). Bayesian approaches to random-effects meta-analysis: A comparative study. Statistics in Medicine, 14, 2685-2699.

Smithrim, K., Upitis, R., Meban, M., & Patteson, A. (2000). Get public or perish. Language and Literacy: A Canadian Educational E-Journal, 2(1), <>

Stake, R., Bresler, L. & Mabry, L. (1991). Custom & Cherishing: The arts in elementary schools. Urbana-Champaign, IL: Council for Research in Music Education.

Standley, J. M. (1996). A meta-analysis on the effects of music as reinforcement for education/therapy objectives. Journal of Research in Music Education, 44(2), 105-133.

Sundin, B. (1989). Early music memories and socialization. Canadian Journal of Research in Music Education, 30(2), 154-161.

Thorndike, E. L. (1910). The contribution of psychology to education. Journal of Educational Psychology, 1, 5-12.

Tippett, L. H. C. (1931). The methods of statistics. London: Williams & Norgate.

Trochim, W. (1989a). An introduction to concept mapping for planning and evaluation. Evaluation and Program Planning, 12, 1-16.

Trochim, W. (1989b). Concept mapping: Soft science or hard art? Evaluation and Program Planning, 12, 87-110.

Tukey, J. W. (1977). Exploratory data analysis. Reading, MA: Addison-Wesley.

Wachter, K.W., & Straf, M.L. (Eds.) (1990). The future of meta-analysis. New York: Russell Sage Foundation.

Walker, J.C. & Evers, C.W. (1997). Research in education: Epistemological issues. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 22-31). Tarrytown, New York: Pergamon Elsevier Science Inc.

Willett, J. B. & Sayer, A. G. (1994). Using covariance structure analysis to detect correlates and predictors of change. Psychological Bulletin, 116, 363-381.

Willett, J. B. & Sayer, A. G. (1995). Cross-domain analyses of change over time: Combining growth modeling and covariance structure analysis. In G.A. Marcoulides & R.E. Schumacker (Eds.), Advanced structural equation modeling: Issues and techniques. Hillsdale, NJ: Lawrence Erlbaum Inc.

Willett, J. B. & Singer, J. D. (1991). From whether to when: New methods for studying student dropout and teacher attrition. Review of Educational Research, 61(4), 407-450.

Woolley. J. C. (2000). Making connections: Pre-reading reader response and mother-research from one family's perspective. Unpublished doctoral dissertation, University of Toronto, Toronto.

Yates, F. & Cochran, W. G. (1938). The analysis of groups of experiments. Journal of Agricultural Science, 28, 556-580.

Zeller, R.A. (1997). Validity. In J.P. Keeves (Ed.), Educational Research Methodology, and Measurement: An International Handbook (2nd ed., pp. 822-829). Tarrytown, New York: Pergamon Elsevier Science Inc.